Sep 29 09:47:48 crc systemd[1]: Starting Kubernetes Kubelet...
Sep 29 09:47:48 crc restorecon[4667]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 29 09:47:48 crc restorecon[4667]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Sep 29 09:47:48 crc restorecon[4667]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc 
restorecon[4667]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:47:48 crc restorecon[4667]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:47:48 crc restorecon[4667]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:47:48 crc restorecon[4667]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:47:48 crc 
restorecon[4667]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 
09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 29 09:47:48 crc restorecon[4667]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 09:47:48 crc 
restorecon[4667]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 29 09:47:48 crc restorecon[4667]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 29 09:47:48 crc restorecon[4667]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 29 09:47:48 crc restorecon[4667]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 29 09:47:48 crc 
restorecon[4667]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 29 09:47:48 crc restorecon[4667]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:48
crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 29 09:47:48 crc restorecon[4667]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:47:48 crc restorecon[4667]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:47:48 crc restorecon[4667]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 09:47:48 crc restorecon[4667]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 
09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:48 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 29 09:47:49 crc 
restorecon[4667]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc 
restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc 
restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:47:49 crc restorecon[4667]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 
29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 
crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc 
restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:47:49 crc restorecon[4667]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc 
restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc 
restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc 
restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:47:49 crc 
restorecon[4667]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:47:49 crc restorecon[4667]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 09:47:49 crc restorecon[4667]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 09:47:49 crc restorecon[4667]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Sep 29 09:47:50 crc kubenswrapper[4891]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 29 09:47:50 crc kubenswrapper[4891]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Sep 29 09:47:50 crc kubenswrapper[4891]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 29 09:47:50 crc kubenswrapper[4891]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 29 09:47:50 crc kubenswrapper[4891]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 29 09:47:50 crc kubenswrapper[4891]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.166076 4891 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174161 4891 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174199 4891 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174206 4891 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174215 4891 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174222 4891 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174229 4891 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174238 4891 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174247 4891 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174253 4891 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174259 4891 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174266 4891 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174276 4891 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174283 4891 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174289 4891 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174295 4891 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174302 4891 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174309 4891 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174315 4891 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174321 4891 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174328 4891 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174335 4891 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174343 4891 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174350 4891 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174357 4891 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174365 4891 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174372 4891 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174379 4891 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174386 4891 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174393 4891 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174401 4891 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174408 4891 
feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174415 4891 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174422 4891 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174428 4891 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174434 4891 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174442 4891 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174449 4891 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174455 4891 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174462 4891 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174468 4891 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174475 4891 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174481 4891 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174487 4891 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174493 4891 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174500 4891 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 29 
09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174506 4891 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174511 4891 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174516 4891 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174524 4891 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174531 4891 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174537 4891 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174542 4891 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174549 4891 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
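Alongside the unrecognized-gate warnings, `feature_gate.go:351`/`feature_gate.go:353` report gates that are explicitly set even though they are already GA or deprecated (ValidatingAdmissionPolicy, CloudDualStackNodeIPs, KMSv1 above). A sketch, assuming the message format shown in this log (the sample strings and function name are illustrative), for pulling those out:

```python
import re

# Hypothetical samples mirroring the feature_gate.go:351/:353 messages above.
SAMPLE = (
    "W0929 09:47:50.174302 4891 feature_gate.go:353] Setting GA feature gate "
    "ValidatingAdmissionPolicy=true. It will be removed in a future release.\n"
    "W0929 09:47:50.174549 4891 feature_gate.go:351] Setting deprecated feature gate "
    "KMSv1=true. It will be removed in a future release.\n"
)

SETTING_RE = re.compile(r"Setting (GA|deprecated) feature gate (\w+)=(\w+)")

def explicit_gate_settings(text: str) -> dict:
    """Map gate name -> (stability, enabled) for gates set despite being GA/deprecated."""
    return {name: (stability, value == "true")
            for stability, name, value in SETTING_RE.findall(text)}

print(explicit_gate_settings(SAMPLE))
```

These are the settings most likely to break on upgrade ("It will be removed in a future release"), so surfacing them separately from the merely-unknown gates is useful when auditing a cluster's feature-gate overrides.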
Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174558 4891 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174563 4891 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174569 4891 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174574 4891 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174579 4891 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174584 4891 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174589 4891 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174595 4891 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174600 4891 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174605 4891 feature_gate.go:330] unrecognized feature gate: Example Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174609 4891 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174614 4891 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174620 4891 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174625 4891 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174630 4891 
feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174635 4891 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174640 4891 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.174646 4891 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175620 4891 flags.go:64] FLAG: --address="0.0.0.0" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175643 4891 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175655 4891 flags.go:64] FLAG: --anonymous-auth="true" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175664 4891 flags.go:64] FLAG: --application-metrics-count-limit="100" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175672 4891 flags.go:64] FLAG: --authentication-token-webhook="false" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175679 4891 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175689 4891 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175698 4891 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175705 4891 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175712 4891 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175720 4891 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175728 4891 flags.go:64] FLAG: 
--cert-dir="/var/lib/kubelet/pki" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175737 4891 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175744 4891 flags.go:64] FLAG: --cgroup-root="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175750 4891 flags.go:64] FLAG: --cgroups-per-qos="true" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175757 4891 flags.go:64] FLAG: --client-ca-file="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175763 4891 flags.go:64] FLAG: --cloud-config="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175770 4891 flags.go:64] FLAG: --cloud-provider="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175776 4891 flags.go:64] FLAG: --cluster-dns="[]" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175812 4891 flags.go:64] FLAG: --cluster-domain="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175820 4891 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175827 4891 flags.go:64] FLAG: --config-dir="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175835 4891 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175842 4891 flags.go:64] FLAG: --container-log-max-files="5" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175851 4891 flags.go:64] FLAG: --container-log-max-size="10Mi" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175858 4891 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175865 4891 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175871 4891 flags.go:64] FLAG: --containerd-namespace="k8s.io" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175878 4891 flags.go:64] FLAG: 
--contention-profiling="false" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175884 4891 flags.go:64] FLAG: --cpu-cfs-quota="true" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175890 4891 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175896 4891 flags.go:64] FLAG: --cpu-manager-policy="none" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175903 4891 flags.go:64] FLAG: --cpu-manager-policy-options="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175911 4891 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175917 4891 flags.go:64] FLAG: --enable-controller-attach-detach="true" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175923 4891 flags.go:64] FLAG: --enable-debugging-handlers="true" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175929 4891 flags.go:64] FLAG: --enable-load-reader="false" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175936 4891 flags.go:64] FLAG: --enable-server="true" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175942 4891 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175951 4891 flags.go:64] FLAG: --event-burst="100" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175958 4891 flags.go:64] FLAG: --event-qps="50" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175966 4891 flags.go:64] FLAG: --event-storage-age-limit="default=0" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175973 4891 flags.go:64] FLAG: --event-storage-event-limit="default=0" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.175989 4891 flags.go:64] FLAG: --eviction-hard="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176002 4891 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176010 4891 flags.go:64] FLAG: 
--eviction-minimum-reclaim="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176017 4891 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176024 4891 flags.go:64] FLAG: --eviction-soft="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176032 4891 flags.go:64] FLAG: --eviction-soft-grace-period="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176038 4891 flags.go:64] FLAG: --exit-on-lock-contention="false" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176044 4891 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176050 4891 flags.go:64] FLAG: --experimental-mounter-path="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176056 4891 flags.go:64] FLAG: --fail-cgroupv1="false" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176063 4891 flags.go:64] FLAG: --fail-swap-on="true" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176069 4891 flags.go:64] FLAG: --feature-gates="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176083 4891 flags.go:64] FLAG: --file-check-frequency="20s" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176090 4891 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176096 4891 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176105 4891 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176112 4891 flags.go:64] FLAG: --healthz-port="10248" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176118 4891 flags.go:64] FLAG: --help="false" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176124 4891 flags.go:64] FLAG: --hostname-override="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176130 4891 flags.go:64] FLAG: 
--housekeeping-interval="10s" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176137 4891 flags.go:64] FLAG: --http-check-frequency="20s" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176143 4891 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176150 4891 flags.go:64] FLAG: --image-credential-provider-config="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176156 4891 flags.go:64] FLAG: --image-gc-high-threshold="85" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176163 4891 flags.go:64] FLAG: --image-gc-low-threshold="80" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176169 4891 flags.go:64] FLAG: --image-service-endpoint="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176174 4891 flags.go:64] FLAG: --kernel-memcg-notification="false" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176180 4891 flags.go:64] FLAG: --kube-api-burst="100" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176186 4891 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176192 4891 flags.go:64] FLAG: --kube-api-qps="50" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176198 4891 flags.go:64] FLAG: --kube-reserved="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176204 4891 flags.go:64] FLAG: --kube-reserved-cgroup="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176210 4891 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176217 4891 flags.go:64] FLAG: --kubelet-cgroups="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176222 4891 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176228 4891 flags.go:64] FLAG: --lock-file="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176234 4891 
flags.go:64] FLAG: --log-cadvisor-usage="false" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176240 4891 flags.go:64] FLAG: --log-flush-frequency="5s" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176246 4891 flags.go:64] FLAG: --log-json-info-buffer-size="0" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176256 4891 flags.go:64] FLAG: --log-json-split-stream="false" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176262 4891 flags.go:64] FLAG: --log-text-info-buffer-size="0" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176269 4891 flags.go:64] FLAG: --log-text-split-stream="false" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176275 4891 flags.go:64] FLAG: --logging-format="text" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176281 4891 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176287 4891 flags.go:64] FLAG: --make-iptables-util-chains="true" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176293 4891 flags.go:64] FLAG: --manifest-url="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176300 4891 flags.go:64] FLAG: --manifest-url-header="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176309 4891 flags.go:64] FLAG: --max-housekeeping-interval="15s" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176316 4891 flags.go:64] FLAG: --max-open-files="1000000" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176324 4891 flags.go:64] FLAG: --max-pods="110" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176331 4891 flags.go:64] FLAG: --maximum-dead-containers="-1" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176338 4891 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176344 4891 flags.go:64] FLAG: --memory-manager-policy="None" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 
09:47:50.176352 4891 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176359 4891 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176366 4891 flags.go:64] FLAG: --node-ip="192.168.126.11" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176373 4891 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176389 4891 flags.go:64] FLAG: --node-status-max-images="50" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176395 4891 flags.go:64] FLAG: --node-status-update-frequency="10s" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176401 4891 flags.go:64] FLAG: --oom-score-adj="-999" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176407 4891 flags.go:64] FLAG: --pod-cidr="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176414 4891 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176425 4891 flags.go:64] FLAG: --pod-manifest-path="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176431 4891 flags.go:64] FLAG: --pod-max-pids="-1" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176437 4891 flags.go:64] FLAG: --pods-per-core="0" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176443 4891 flags.go:64] FLAG: --port="10250" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176449 4891 flags.go:64] FLAG: --protect-kernel-defaults="false" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176456 4891 flags.go:64] FLAG: --provider-id="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176462 4891 flags.go:64] FLAG: --qos-reserved="" Sep 29 09:47:50 crc 
kubenswrapper[4891]: I0929 09:47:50.176468 4891 flags.go:64] FLAG: --read-only-port="10255" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176473 4891 flags.go:64] FLAG: --register-node="true" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176479 4891 flags.go:64] FLAG: --register-schedulable="true" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176486 4891 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176496 4891 flags.go:64] FLAG: --registry-burst="10" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176502 4891 flags.go:64] FLAG: --registry-qps="5" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176508 4891 flags.go:64] FLAG: --reserved-cpus="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176514 4891 flags.go:64] FLAG: --reserved-memory="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176523 4891 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176529 4891 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176535 4891 flags.go:64] FLAG: --rotate-certificates="false" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176542 4891 flags.go:64] FLAG: --rotate-server-certificates="false" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176548 4891 flags.go:64] FLAG: --runonce="false" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176554 4891 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176560 4891 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176567 4891 flags.go:64] FLAG: --seccomp-default="false" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176572 4891 flags.go:64] FLAG: --serialize-image-pulls="true" Sep 29 09:47:50 crc 
kubenswrapper[4891]: I0929 09:47:50.176579 4891 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176586 4891 flags.go:64] FLAG: --storage-driver-db="cadvisor" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176592 4891 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176598 4891 flags.go:64] FLAG: --storage-driver-password="root" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176604 4891 flags.go:64] FLAG: --storage-driver-secure="false" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176610 4891 flags.go:64] FLAG: --storage-driver-table="stats" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176616 4891 flags.go:64] FLAG: --storage-driver-user="root" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176622 4891 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176628 4891 flags.go:64] FLAG: --sync-frequency="1m0s" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176635 4891 flags.go:64] FLAG: --system-cgroups="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176640 4891 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176649 4891 flags.go:64] FLAG: --system-reserved-cgroup="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176655 4891 flags.go:64] FLAG: --tls-cert-file="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176661 4891 flags.go:64] FLAG: --tls-cipher-suites="[]" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176669 4891 flags.go:64] FLAG: --tls-min-version="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176676 4891 flags.go:64] FLAG: --tls-private-key-file="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176682 4891 flags.go:64] FLAG: 
--topology-manager-policy="none" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176688 4891 flags.go:64] FLAG: --topology-manager-policy-options="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176695 4891 flags.go:64] FLAG: --topology-manager-scope="container" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176701 4891 flags.go:64] FLAG: --v="2" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176710 4891 flags.go:64] FLAG: --version="false" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176718 4891 flags.go:64] FLAG: --vmodule="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176725 4891 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.176732 4891 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.176896 4891 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.176904 4891 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.176911 4891 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.176918 4891 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.176924 4891 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.176930 4891 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.176935 4891 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.176943 4891 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. 
It will be removed in a future release. Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.176951 4891 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.176958 4891 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.176964 4891 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.176971 4891 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.176977 4891 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.176984 4891 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.176991 4891 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.176998 4891 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177005 4891 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177015 4891 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
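The long `flags.go:64` dump above records every kubelet command-line flag and its effective default (e.g. `--node-ip="192.168.126.11"`, `--config="/etc/kubernetes/kubelet.conf"`). A minimal sketch for turning that dump into a lookup table, assuming only the `FLAG: --name="value"` format shown here (sample text and helper name are illustrative):

```python
import re

# Hypothetical excerpt of the flags.go:64 dump above.
SAMPLE = (
    'I0929 09:47:50.175643 4891 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" '
    'I0929 09:47:50.176373 4891 flags.go:64] FLAG: --node-ip="192.168.126.11" '
    'I0929 09:47:50.176701 4891 flags.go:64] FLAG: --v="2"'
)

FLAG_RE = re.compile(r'FLAG: --([\w-]+)="([^"]*)"')

def kubelet_flags(text: str) -> dict:
    """Map flag name -> quoted value as printed in the flags.go:64 dump."""
    return dict(FLAG_RE.findall(text))

flags = kubelet_flags(SAMPLE)
print(flags["node-ip"])  # → 192.168.126.11
```

Because the dump is emitted before the config file is merged in, these values are the flag-level defaults/overrides only; the effective runtime configuration also folds in `/etc/kubernetes/kubelet.conf` (the `--config` path shown above).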
Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177022 4891 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177029 4891 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177035 4891 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177042 4891 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177047 4891 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177053 4891 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177059 4891 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177064 4891 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177070 4891 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177075 4891 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177081 4891 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177086 4891 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177092 4891 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177097 4891 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177103 4891 
feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177108 4891 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177115 4891 feature_gate.go:330] unrecognized feature gate: Example Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177122 4891 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177127 4891 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177132 4891 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177137 4891 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177143 4891 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177150 4891 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177156 4891 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177161 4891 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177166 4891 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177171 4891 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177176 4891 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177182 4891 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 29 09:47:50 crc 
kubenswrapper[4891]: W0929 09:47:50.177188 4891 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177196 4891 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177202 4891 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177209 4891 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177215 4891 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177221 4891 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177226 4891 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177232 4891 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177237 4891 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177242 4891 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177247 4891 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177253 4891 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177258 4891 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177263 4891 feature_gate.go:330] unrecognized feature gate: 
MinimumKubeletVersion Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177269 4891 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177274 4891 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177279 4891 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177284 4891 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177289 4891 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177295 4891 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177301 4891 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177307 4891 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177312 4891 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.177318 4891 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.177927 4891 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 
09:47:50.191727 4891 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.191770 4891 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191876 4891 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191886 4891 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191890 4891 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191895 4891 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191899 4891 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191903 4891 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191907 4891 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191911 4891 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191915 4891 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191919 4891 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191922 4891 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191926 4891 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191929 4891 feature_gate.go:330] unrecognized feature gate: 
GatewayAPI Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191934 4891 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191940 4891 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191944 4891 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191948 4891 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191952 4891 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191956 4891 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191960 4891 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191964 4891 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191970 4891 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191974 4891 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191980 4891 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191984 4891 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191988 4891 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191993 4891 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.191997 4891 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192000 4891 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192004 4891 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192008 4891 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192013 4891 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192017 4891 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192022 4891 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192026 4891 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192030 4891 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192033 4891 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192036 4891 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192040 4891 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192043 4891 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192047 4891 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192051 4891 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192054 4891 feature_gate.go:330] unrecognized feature gate: Example Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192058 4891 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192061 4891 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192064 4891 feature_gate.go:330] unrecognized 
feature gate: PrivateHostedZoneAWS Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192068 4891 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192071 4891 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192075 4891 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192078 4891 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192082 4891 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192086 4891 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192091 4891 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192095 4891 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192099 4891 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192103 4891 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192107 4891 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192112 4891 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192116 4891 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192120 4891 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 29 
09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192124 4891 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192128 4891 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192132 4891 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192135 4891 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192139 4891 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192143 4891 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192146 4891 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192150 4891 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192153 4891 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192157 4891 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192160 4891 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.192168 4891 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false 
ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192327 4891 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192336 4891 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192341 4891 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192345 4891 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192348 4891 feature_gate.go:330] unrecognized feature gate: Example Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192352 4891 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192355 4891 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192359 4891 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192363 4891 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192366 4891 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192370 4891 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192373 4891 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192376 4891 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192380 4891 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 29 09:47:50 crc 
kubenswrapper[4891]: W0929 09:47:50.192384 4891 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192387 4891 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192391 4891 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192394 4891 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192399 4891 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192404 4891 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192407 4891 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192412 4891 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192415 4891 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192418 4891 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192423 4891 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192426 4891 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192431 4891 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192434 4891 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 
09:47:50.192438 4891 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192441 4891 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192445 4891 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192448 4891 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192452 4891 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192455 4891 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192459 4891 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192462 4891 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192465 4891 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192470 4891 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192474 4891 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192479 4891 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192483 4891 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192488 4891 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192493 4891 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192498 4891 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192502 4891 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192506 4891 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192510 4891 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192514 4891 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192518 4891 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192521 4891 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192525 4891 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192529 4891 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192533 4891 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192537 4891 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192540 4891 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192544 4891 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192548 4891 
feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192552 4891 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192556 4891 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192559 4891 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192564 4891 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192569 4891 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192573 4891 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192578 4891 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192583 4891 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192587 4891 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192591 4891 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192595 4891 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192600 4891 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192604 4891 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.192608 4891 
feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.192613 4891 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.193608 4891 server.go:940] "Client rotation is on, will bootstrap in background" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.197330 4891 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.197423 4891 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.198748 4891 server.go:997] "Starting client certificate rotation" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.198769 4891 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.199856 4891 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-14 05:10:28.605282348 +0000 UTC Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.199963 4891 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1819h22m38.405321574s for next certificate rotation Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.232770 4891 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.238080 4891 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.258307 4891 log.go:25] "Validated CRI v1 runtime API" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.291351 4891 log.go:25] "Validated CRI v1 image API" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.293161 4891 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.298970 4891 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-09-29-09-43-41-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.299020 4891 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.316629 4891 manager.go:217] Machine: {Timestamp:2025-09-29 09:47:50.314022892 +0000 UTC m=+0.519191233 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:4df1a8e1-540a-4a1c-bb3d-0ff769533bc7 BootID:37d954c2-3d94-47a1-be48-2d150f56c63a Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:73:05:92 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:73:05:92 Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:d3:c6:b1 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:3d:a4:c8 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:3a:08:6a Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:19:71:b3 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ee:fb:ba:f8:ef:34 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:8e:a6:67:cc:4d:43 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 
Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.317277 4891 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.317527 4891 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.317982 4891 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.318279 4891 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.318399 4891 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.318742 4891 topology_manager.go:138] "Creating topology manager with none policy"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.318845 4891 container_manager_linux.go:303] "Creating device plugin manager"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.319421 4891 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.319509 4891 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.320301 4891 state_mem.go:36] "Initialized new in-memory state store"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.320514 4891 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.323434 4891 kubelet.go:418] "Attempting to sync node with API server"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.323515 4891 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.323590 4891 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.323657 4891 kubelet.go:324] "Adding apiserver pod source"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.323718 4891 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.329467 4891 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.329985 4891 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused
Sep 29 09:47:50 crc kubenswrapper[4891]: E0929 09:47:50.330094 4891 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError"
Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.330233 4891 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused
Sep 29 09:47:50 crc kubenswrapper[4891]: E0929 09:47:50.330336 4891 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.330690 4891 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.334885 4891 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.336936 4891 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.336973 4891 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.336985 4891 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.336997 4891 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.337012 4891 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.337023 4891 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.337034 4891 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.337050 4891 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.337062 4891 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.337074 4891 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.337088 4891 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.337098 4891 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.338086 4891 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.338780 4891 server.go:1280] "Started kubelet"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.339120 4891 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.339419 4891 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.339439 4891 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.339967 4891 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.340952 4891 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.340982 4891 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.341062 4891 volume_manager.go:287] "The desired_state_of_world populator starts"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.341073 4891 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.341063 4891 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 16:53:40.530303472 +0000 UTC
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.341100 4891 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 2215h5m50.189204826s for next certificate rotation
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.341146 4891 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Sep 29 09:47:50 crc kubenswrapper[4891]: E0929 09:47:50.341210 4891 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Sep 29 09:47:50 crc systemd[1]: Started Kubernetes Kubelet.
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.342046 4891 factory.go:55] Registering systemd factory
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.342067 4891 factory.go:221] Registration of the systemd container factory successfully
Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.342221 4891 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused
Sep 29 09:47:50 crc kubenswrapper[4891]: E0929 09:47:50.342319 4891 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.342604 4891 factory.go:153] Registering CRI-O factory
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.342661 4891 factory.go:221] Registration of the crio container factory successfully
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.344571 4891 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.347070 4891 factory.go:103] Registering Raw factory
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.347129 4891 manager.go:1196] Started watching for new ooms in manager
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.348041 4891 manager.go:319] Starting recovery of all containers
Sep 29 09:47:50 crc kubenswrapper[4891]: E0929 09:47:50.348170 4891 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="200ms"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.349664 4891 server.go:460] "Adding debug handlers to kubelet server"
Sep 29 09:47:50 crc kubenswrapper[4891]: E0929 09:47:50.353232 4891 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1869b7d896faefe5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-09-29 09:47:50.338736101 +0000 UTC m=+0.543904432,LastTimestamp:2025-09-29 09:47:50.338736101 +0000 UTC m=+0.543904432,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360135 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360203 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360227 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360241 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360254 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360269 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360284 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360320 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360335 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360348 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360363 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360377 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360393 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360435 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360453 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360476 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360495 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360513 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360530 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360550 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360578 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360611 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360625 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360640 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360657 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360671 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360689 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.360711 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.362708 4891 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.362753 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.362774 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.362799 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.362893 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.362919 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.362939 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.362958 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.362975 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.362996 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363016 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363032 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363048 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363067 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363081 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363096 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363110 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363125 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363138 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363152 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363167 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363181 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363196 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363210 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363224 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363246 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363305 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363325 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363342 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363357 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363379 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363393 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363407 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363427 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363441 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363456 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363469 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363481 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363495 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363508 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363521 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363534 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363548 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363563 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363577 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363591 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363605 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363620 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363632 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363647 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363660 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363674 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363689 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363704 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363718 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363732 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363750 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363833 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363862 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363944 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.363969 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364016 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364036 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364055 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364118 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364154 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364218 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364244 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364298 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364328 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364352 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" 
seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364408 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364430 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364491 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364516 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364533 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364586 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364615 
4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364635 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364684 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364705 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364747 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364768 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364823 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364844 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364910 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364935 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364954 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.364998 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365016 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365033 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365074 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365094 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365114 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365130 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365210 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" 
seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365257 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365276 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365292 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365350 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365367 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365418 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365435 
4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365452 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365465 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365503 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365518 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365531 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365544 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365574 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365585 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365595 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365605 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365615 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365624 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365653 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365664 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365674 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365893 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365906 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365919 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" 
seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365931 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365944 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365978 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.365993 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366005 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366017 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366030 4891 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366066 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366077 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366089 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366162 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366174 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366186 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366197 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366209 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366245 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366314 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366332 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366396 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366409 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366421 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366433 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366443 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366469 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366482 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366494 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366512 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366527 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366560 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366574 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366585 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366593 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366637 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366647 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366656 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366666 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366676 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" 
seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366686 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366715 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366724 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366735 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366745 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366755 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366764 
4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366798 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366808 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366819 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366829 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366842 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366875 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366890 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366901 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366912 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366921 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366947 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366958 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" 
volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366966 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366974 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366983 4891 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366992 4891 reconstruct.go:97] "Volume reconstruction finished" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.366999 4891 reconciler.go:26] "Reconciler: start to sync state" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.370731 4891 manager.go:324] Recovery completed Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.379593 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.382425 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.382481 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.382493 4891 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.383484 4891 cpu_manager.go:225] "Starting CPU manager" policy="none" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.383518 4891 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.383555 4891 state_mem.go:36] "Initialized new in-memory state store" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.391660 4891 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.394405 4891 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.394469 4891 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.394509 4891 kubelet.go:2335] "Starting kubelet main sync loop" Sep 29 09:47:50 crc kubenswrapper[4891]: E0929 09:47:50.394562 4891 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.396411 4891 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Sep 29 09:47:50 crc kubenswrapper[4891]: E0929 09:47:50.396477 4891 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Sep 29 09:47:50 crc 
kubenswrapper[4891]: I0929 09:47:50.413317 4891 policy_none.go:49] "None policy: Start" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.415632 4891 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.415678 4891 state_mem.go:35] "Initializing new in-memory state store" Sep 29 09:47:50 crc kubenswrapper[4891]: E0929 09:47:50.442027 4891 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.487555 4891 manager.go:334] "Starting Device Plugin manager" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.487661 4891 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.487677 4891 server.go:79] "Starting device plugin registration server" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.488189 4891 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.488210 4891 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.489895 4891 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.490042 4891 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.490060 4891 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.495421 4891 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.495611 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.496961 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.497029 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.497048 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.497238 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.498165 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.498238 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.498587 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.498628 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.498641 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.498910 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.498992 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.499035 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.499212 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.499251 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.499265 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.499832 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.499855 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.499863 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.499939 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.499956 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.499966 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.499971 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:50 crc 
kubenswrapper[4891]: E0929 09:47:50.500177 4891 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.500377 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.500417 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.500586 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.500607 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.500620 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.500765 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.500887 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.500922 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.501079 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.501100 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.501109 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.501420 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.501444 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.501454 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.501613 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.501639 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.501642 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.501687 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.501698 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.502242 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.502257 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.502267 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:50 crc kubenswrapper[4891]: E0929 09:47:50.549163 4891 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="400ms" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.569848 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.569903 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.569932 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.569955 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.569979 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.570004 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.570026 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.570047 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.570068 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.570088 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.570108 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 
09:47:50.570127 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.570149 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.570169 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.570190 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.588576 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.589743 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.589779 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 
09:47:50.589794 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.589840 4891 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 29 09:47:50 crc kubenswrapper[4891]: E0929 09:47:50.590431 4891 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671092 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671179 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671213 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671238 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:47:50 crc 
kubenswrapper[4891]: I0929 09:47:50.671262 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671290 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671320 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671347 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671376 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671378 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671458 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671465 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671475 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671531 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671510 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671569 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671538 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671582 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671407 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671629 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671680 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671720 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671769 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671848 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671849 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671902 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671907 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671847 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671988 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.671781 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.791297 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.793140 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.793210 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:50 crc 
kubenswrapper[4891]: I0929 09:47:50.793223 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.793258 4891 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 29 09:47:50 crc kubenswrapper[4891]: E0929 09:47:50.794066 4891 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.841897 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.848850 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.864023 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.879051 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: I0929 09:47:50.884631 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.910774 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-ab7d79806babe6194dd3ff4c5c0aaf278b03ef3a858e092104cfb098f6577494 WatchSource:0}: Error finding container ab7d79806babe6194dd3ff4c5c0aaf278b03ef3a858e092104cfb098f6577494: Status 404 returned error can't find the container with id ab7d79806babe6194dd3ff4c5c0aaf278b03ef3a858e092104cfb098f6577494 Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.913210 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-219d2d5e6d33221c8ec168892c18a77c91ba13735a213ca15812d1eb29c4afda WatchSource:0}: Error finding container 219d2d5e6d33221c8ec168892c18a77c91ba13735a213ca15812d1eb29c4afda: Status 404 returned error can't find the container with id 219d2d5e6d33221c8ec168892c18a77c91ba13735a213ca15812d1eb29c4afda Sep 29 09:47:50 crc kubenswrapper[4891]: W0929 09:47:50.917343 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-cd3b842a3d2b56ca32c71526cf91d414555d9c59c6d2de006f58f6df09785982 WatchSource:0}: Error finding container cd3b842a3d2b56ca32c71526cf91d414555d9c59c6d2de006f58f6df09785982: Status 404 returned error can't find the container with id cd3b842a3d2b56ca32c71526cf91d414555d9c59c6d2de006f58f6df09785982 Sep 29 09:47:50 crc kubenswrapper[4891]: E0929 09:47:50.950707 4891 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: 
connection refused" interval="800ms" Sep 29 09:47:51 crc kubenswrapper[4891]: I0929 09:47:51.195082 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:51 crc kubenswrapper[4891]: I0929 09:47:51.196460 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:51 crc kubenswrapper[4891]: I0929 09:47:51.196523 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:51 crc kubenswrapper[4891]: I0929 09:47:51.196541 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:51 crc kubenswrapper[4891]: I0929 09:47:51.196585 4891 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 29 09:47:51 crc kubenswrapper[4891]: E0929 09:47:51.197221 4891 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Sep 29 09:47:51 crc kubenswrapper[4891]: W0929 09:47:51.205050 4891 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Sep 29 09:47:51 crc kubenswrapper[4891]: E0929 09:47:51.205156 4891 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Sep 29 09:47:51 crc kubenswrapper[4891]: I0929 09:47:51.340988 4891 csi_plugin.go:884] Failed to contact API server when waiting for CSINode 
publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Sep 29 09:47:51 crc kubenswrapper[4891]: W0929 09:47:51.351905 4891 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Sep 29 09:47:51 crc kubenswrapper[4891]: E0929 09:47:51.352007 4891 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Sep 29 09:47:51 crc kubenswrapper[4891]: I0929 09:47:51.399540 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ab7d79806babe6194dd3ff4c5c0aaf278b03ef3a858e092104cfb098f6577494"} Sep 29 09:47:51 crc kubenswrapper[4891]: I0929 09:47:51.400662 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2b557b86a07552d2893c552adb4c6c15c90c0c6ebdd635bb8c123f147b062804"} Sep 29 09:47:51 crc kubenswrapper[4891]: I0929 09:47:51.402845 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9d96e9aa14da8b4f39d097f0485faf02e8b79a9bf6936cc70c6ec3e499e530a9"} Sep 29 09:47:51 crc kubenswrapper[4891]: I0929 09:47:51.405246 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cd3b842a3d2b56ca32c71526cf91d414555d9c59c6d2de006f58f6df09785982"} Sep 29 09:47:51 crc kubenswrapper[4891]: I0929 09:47:51.409128 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"219d2d5e6d33221c8ec168892c18a77c91ba13735a213ca15812d1eb29c4afda"} Sep 29 09:47:51 crc kubenswrapper[4891]: W0929 09:47:51.551282 4891 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Sep 29 09:47:51 crc kubenswrapper[4891]: E0929 09:47:51.551921 4891 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Sep 29 09:47:51 crc kubenswrapper[4891]: W0929 09:47:51.720975 4891 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Sep 29 09:47:51 crc kubenswrapper[4891]: E0929 09:47:51.721088 4891 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Sep 29 09:47:51 crc 
kubenswrapper[4891]: E0929 09:47:51.751933 4891 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="1.6s" Sep 29 09:47:51 crc kubenswrapper[4891]: I0929 09:47:51.997419 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:51 crc kubenswrapper[4891]: I0929 09:47:51.999672 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:51 crc kubenswrapper[4891]: I0929 09:47:51.999726 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:51 crc kubenswrapper[4891]: I0929 09:47:51.999741 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:51 crc kubenswrapper[4891]: I0929 09:47:51.999776 4891 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 29 09:47:52 crc kubenswrapper[4891]: E0929 09:47:52.000533 4891 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.341006 4891 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.415524 4891 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b27305cf97352a9d9af44c19b92b733bb8348547525917522068974d7abab852" exitCode=0 Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 
09:47:52.415610 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b27305cf97352a9d9af44c19b92b733bb8348547525917522068974d7abab852"} Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.415679 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.416976 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.417008 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.417019 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.419573 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009"} Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.419604 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6"} Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.419615 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1"} Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.419624 4891 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a"} Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.419679 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.420239 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.420269 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.420281 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.421557 4891 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2" exitCode=0 Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.421615 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2"} Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.421732 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.422638 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.422673 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.422687 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.423982 4891 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="54b45bdc346d2afb241021854cd164cf3af6e743dcbca474c14118a35dfaf630" exitCode=0 Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.424051 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"54b45bdc346d2afb241021854cd164cf3af6e743dcbca474c14118a35dfaf630"} Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.424175 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.424761 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.428687 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.428746 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.428770 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.428788 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.428848 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:52 crc 
kubenswrapper[4891]: I0929 09:47:52.428868 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.431674 4891 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="016dda820498b6f4f30aecbd0eded36505e3fb2a366f19a1ebd0a77eabc1b82e" exitCode=0 Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.431745 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"016dda820498b6f4f30aecbd0eded36505e3fb2a366f19a1ebd0a77eabc1b82e"} Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.431915 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.433405 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.433448 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.433458 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:52 crc kubenswrapper[4891]: I0929 09:47:52.830456 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.341068 4891 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Sep 29 09:47:53 crc kubenswrapper[4891]: E0929 09:47:53.353623 4891 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="3.2s" Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.435669 4891 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7e1a05ef475cb21ef2da5f90990f10238d6753b968cddc7e0943bcf6ac280a90" exitCode=0 Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.435804 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7e1a05ef475cb21ef2da5f90990f10238d6753b968cddc7e0943bcf6ac280a90"} Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.436000 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.437897 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.437983 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.438005 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.440757 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b"} Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.440840 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b"} Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.440858 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0"} Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.444396 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f7ce3227865c2ececfa056500f90c320210ff247b8c173d45efdc901216b4968"} Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.444497 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.445577 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.445640 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.445654 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.449366 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cf9a72f3a53e9bfc74e1f0bb793af51187cb6f11787a54af3e775c5a271b8b1c"} Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.449483 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"352c9f8910374f43aa116300526704ebe076299397ecd20b86be658d53f38593"} Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.449502 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2215d70654d4558acf393bc3f75c191cb130c14b1a705de6f7ef040d792afa90"} Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.449413 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.449401 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.451193 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.451260 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.451277 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.451359 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.451401 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.451416 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:53 crc kubenswrapper[4891]: W0929 09:47:53.475979 4891 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Sep 29 09:47:53 crc kubenswrapper[4891]: E0929 09:47:53.476106 4891 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Sep 29 09:47:53 crc kubenswrapper[4891]: W0929 09:47:53.510432 4891 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Sep 29 09:47:53 crc kubenswrapper[4891]: E0929 09:47:53.510575 4891 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.600902 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.602940 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.602984 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:53 crc kubenswrapper[4891]: I0929 09:47:53.602998 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:53 crc 
kubenswrapper[4891]: I0929 09:47:53.603028 4891 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 29 09:47:53 crc kubenswrapper[4891]: E0929 09:47:53.603767 4891 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Sep 29 09:47:53 crc kubenswrapper[4891]: W0929 09:47:53.644468 4891 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Sep 29 09:47:53 crc kubenswrapper[4891]: E0929 09:47:53.644564 4891 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.341194 4891 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Sep 29 09:47:54 crc kubenswrapper[4891]: W0929 09:47:54.409818 4891 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Sep 29 09:47:54 crc kubenswrapper[4891]: E0929 09:47:54.409933 4891 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.457180 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"57166ed9b36d58a2de9086705114aad0a1d198dd0c0a50352400795613a2899f"} Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.457234 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088"} Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.457262 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.458304 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.458348 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.458360 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.459713 4891 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="279f1ccd69daf4bb63dd134c685d2a8f66ebff5cd878a0175da28d85b94c6a42" exitCode=0 Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.459832 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.459849 4891 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.459889 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.459898 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.459937 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.459946 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"279f1ccd69daf4bb63dd134c685d2a8f66ebff5cd878a0175da28d85b94c6a42"} Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.461271 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.461297 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.461307 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.461313 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.461336 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.461349 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.461442 4891 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.461475 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.461487 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.462066 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.462125 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.462141 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.513545 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.513705 4891 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.514088 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": dial tcp 192.168.126.11:6443: connect: connection refused" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.814689 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:47:54 crc kubenswrapper[4891]: I0929 09:47:54.927209 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 09:47:55 crc kubenswrapper[4891]: I0929 09:47:55.341039 4891 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Sep 29 09:47:55 crc kubenswrapper[4891]: I0929 09:47:55.464260 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e933c945c8574d96831d7762b46a0e53122b5f61d07c851709ffb4f29fe6d27a"} Sep 29 09:47:55 crc kubenswrapper[4891]: I0929 09:47:55.464317 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c00f0fa0e29d94ff1993fb2e5f73476797e3c68922452eff556195f9b616dc4f"} Sep 29 09:47:55 crc kubenswrapper[4891]: I0929 09:47:55.467622 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 29 09:47:55 crc kubenswrapper[4891]: I0929 09:47:55.469547 4891 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="57166ed9b36d58a2de9086705114aad0a1d198dd0c0a50352400795613a2899f" exitCode=255 Sep 29 09:47:55 crc kubenswrapper[4891]: I0929 09:47:55.469642 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:55 crc kubenswrapper[4891]: I0929 09:47:55.469649 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"57166ed9b36d58a2de9086705114aad0a1d198dd0c0a50352400795613a2899f"} Sep 29 09:47:55 crc kubenswrapper[4891]: I0929 09:47:55.469643 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:55 crc kubenswrapper[4891]: I0929 09:47:55.470468 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:55 crc kubenswrapper[4891]: I0929 09:47:55.470493 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:55 crc kubenswrapper[4891]: I0929 09:47:55.470501 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:55 crc kubenswrapper[4891]: I0929 09:47:55.470469 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:55 crc kubenswrapper[4891]: I0929 09:47:55.470607 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:55 crc kubenswrapper[4891]: I0929 09:47:55.470622 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:55 crc kubenswrapper[4891]: I0929 09:47:55.471308 4891 scope.go:117] "RemoveContainer" containerID="57166ed9b36d58a2de9086705114aad0a1d198dd0c0a50352400795613a2899f" Sep 29 09:47:55 crc kubenswrapper[4891]: I0929 09:47:55.653865 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:47:55 crc kubenswrapper[4891]: I0929 09:47:55.654046 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:55 crc kubenswrapper[4891]: I0929 09:47:55.655134 4891 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:55 crc kubenswrapper[4891]: I0929 09:47:55.655184 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:55 crc kubenswrapper[4891]: I0929 09:47:55.655194 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:55 crc kubenswrapper[4891]: I0929 09:47:55.662635 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:47:55 crc kubenswrapper[4891]: I0929 09:47:55.831093 4891 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 29 09:47:55 crc kubenswrapper[4891]: I0929 09:47:55.831213 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.475120 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.476910 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75"} Sep 29 09:47:56 
crc kubenswrapper[4891]: I0929 09:47:56.476981 4891 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.477031 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.477937 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.477965 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.477975 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.480563 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.480939 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.481187 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"265c019cc82e0823c56ab1e32763d48b07e833ad85b8fa38dddbcc4e586723a4"} Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.481208 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9e04aabf30f748cc42407830d2e5c356e418c24d27d0c46e16a7a20a9425219f"} Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.481217 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"82b79b8b65eeaecceae6b4360b41b2e9ab54f0b66a0b37811377c251847d3f2a"} Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.481410 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.481428 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.481436 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.481841 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.481852 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.481860 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.578623 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.804526 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.806541 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.806579 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.806588 4891 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.806646 4891 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.900550 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Sep 29 09:47:56 crc kubenswrapper[4891]: I0929 09:47:56.928946 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:47:57 crc kubenswrapper[4891]: I0929 09:47:57.482753 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:57 crc kubenswrapper[4891]: I0929 09:47:57.482849 4891 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 09:47:57 crc kubenswrapper[4891]: I0929 09:47:57.482880 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:57 crc kubenswrapper[4891]: I0929 09:47:57.482969 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:57 crc kubenswrapper[4891]: I0929 09:47:57.483929 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:57 crc kubenswrapper[4891]: I0929 09:47:57.483964 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:57 crc kubenswrapper[4891]: I0929 09:47:57.483974 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:57 crc kubenswrapper[4891]: I0929 09:47:57.484018 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:57 crc kubenswrapper[4891]: I0929 09:47:57.484046 4891 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:57 crc kubenswrapper[4891]: I0929 09:47:57.484055 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:57 crc kubenswrapper[4891]: I0929 09:47:57.484223 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:57 crc kubenswrapper[4891]: I0929 09:47:57.484271 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:57 crc kubenswrapper[4891]: I0929 09:47:57.484287 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:58 crc kubenswrapper[4891]: I0929 09:47:58.100725 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:47:58 crc kubenswrapper[4891]: I0929 09:47:58.310123 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Sep 29 09:47:58 crc kubenswrapper[4891]: I0929 09:47:58.485619 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:58 crc kubenswrapper[4891]: I0929 09:47:58.485619 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:58 crc kubenswrapper[4891]: I0929 09:47:58.485652 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:58 crc kubenswrapper[4891]: I0929 09:47:58.487431 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:58 crc kubenswrapper[4891]: I0929 09:47:58.487431 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:58 crc kubenswrapper[4891]: I0929 
09:47:58.487490 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:58 crc kubenswrapper[4891]: I0929 09:47:58.487505 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:58 crc kubenswrapper[4891]: I0929 09:47:58.487468 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:58 crc kubenswrapper[4891]: I0929 09:47:58.487624 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:58 crc kubenswrapper[4891]: I0929 09:47:58.487698 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:58 crc kubenswrapper[4891]: I0929 09:47:58.487719 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:58 crc kubenswrapper[4891]: I0929 09:47:58.487730 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:47:59 crc kubenswrapper[4891]: I0929 09:47:59.489784 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:47:59 crc kubenswrapper[4891]: I0929 09:47:59.490658 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:47:59 crc kubenswrapper[4891]: I0929 09:47:59.490688 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:47:59 crc kubenswrapper[4891]: I0929 09:47:59.490697 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:00 crc kubenswrapper[4891]: E0929 09:48:00.500974 4891 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get 
node info: node \"crc\" not found" Sep 29 09:48:05 crc kubenswrapper[4891]: I0929 09:48:05.831170 4891 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 29 09:48:05 crc kubenswrapper[4891]: I0929 09:48:05.831271 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Sep 29 09:48:06 crc kubenswrapper[4891]: I0929 09:48:06.340233 4891 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Sep 29 09:48:06 crc kubenswrapper[4891]: I0929 09:48:06.340300 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 29 09:48:06 crc kubenswrapper[4891]: I0929 09:48:06.344643 4891 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Sep 29 09:48:06 crc kubenswrapper[4891]: I0929 09:48:06.344781 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 29 09:48:06 crc kubenswrapper[4891]: I0929 09:48:06.585068 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:48:06 crc kubenswrapper[4891]: I0929 09:48:06.585487 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:48:06 crc kubenswrapper[4891]: I0929 09:48:06.586849 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:06 crc kubenswrapper[4891]: I0929 09:48:06.586893 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:06 crc kubenswrapper[4891]: I0929 09:48:06.586904 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:08 crc kubenswrapper[4891]: I0929 09:48:08.351480 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Sep 29 09:48:08 crc kubenswrapper[4891]: I0929 09:48:08.352515 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:48:08 crc kubenswrapper[4891]: I0929 09:48:08.353996 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:08 crc kubenswrapper[4891]: I0929 09:48:08.354041 4891 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:08 crc kubenswrapper[4891]: I0929 09:48:08.354053 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:08 crc kubenswrapper[4891]: I0929 09:48:08.367017 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Sep 29 09:48:08 crc kubenswrapper[4891]: I0929 09:48:08.513554 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:48:08 crc kubenswrapper[4891]: I0929 09:48:08.514774 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:08 crc kubenswrapper[4891]: I0929 09:48:08.514847 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:08 crc kubenswrapper[4891]: I0929 09:48:08.514862 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:09 crc kubenswrapper[4891]: I0929 09:48:09.517885 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:48:09 crc kubenswrapper[4891]: I0929 09:48:09.518090 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:48:09 crc kubenswrapper[4891]: I0929 09:48:09.519012 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:09 crc kubenswrapper[4891]: I0929 09:48:09.519047 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:09 crc kubenswrapper[4891]: I0929 09:48:09.519056 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 
09:48:09 crc kubenswrapper[4891]: I0929 09:48:09.523662 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:48:10 crc kubenswrapper[4891]: E0929 09:48:10.501170 4891 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 29 09:48:10 crc kubenswrapper[4891]: I0929 09:48:10.517375 4891 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 09:48:10 crc kubenswrapper[4891]: I0929 09:48:10.517747 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:48:10 crc kubenswrapper[4891]: I0929 09:48:10.524420 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:10 crc kubenswrapper[4891]: I0929 09:48:10.524501 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:10 crc kubenswrapper[4891]: I0929 09:48:10.524517 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:11 crc kubenswrapper[4891]: E0929 09:48:11.327537 4891 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.329870 4891 trace.go:236] Trace[1935078148]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Sep-2025 09:47:57.319) (total time: 14010ms): Sep 29 09:48:11 crc kubenswrapper[4891]: Trace[1935078148]: ---"Objects listed" error: 14010ms (09:48:11.329) Sep 29 09:48:11 crc kubenswrapper[4891]: Trace[1935078148]: [14.010360627s] [14.010360627s] END Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.329919 4891 
reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Sep 29 09:48:11 crc kubenswrapper[4891]: E0929 09:48:11.331469 4891 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.332021 4891 trace.go:236] Trace[1538579560]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Sep-2025 09:48:00.059) (total time: 11272ms): Sep 29 09:48:11 crc kubenswrapper[4891]: Trace[1538579560]: ---"Objects listed" error: 11272ms (09:48:11.331) Sep 29 09:48:11 crc kubenswrapper[4891]: Trace[1538579560]: [11.272883872s] [11.272883872s] END Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.332064 4891 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.334132 4891 trace.go:236] Trace[1726203481]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Sep-2025 09:47:57.280) (total time: 14053ms): Sep 29 09:48:11 crc kubenswrapper[4891]: Trace[1726203481]: ---"Objects listed" error: 14053ms (09:48:11.333) Sep 29 09:48:11 crc kubenswrapper[4891]: Trace[1726203481]: [14.053463372s] [14.053463372s] END Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.334175 4891 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.336859 4891 apiserver.go:52] "Watching apiserver" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.338447 4891 trace.go:236] Trace[855403797]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Sep-2025 09:47:59.634) (total time: 11703ms): Sep 29 09:48:11 crc kubenswrapper[4891]: Trace[855403797]: ---"Objects listed" error: 11703ms (09:48:11.338) Sep 29 09:48:11 
crc kubenswrapper[4891]: Trace[855403797]: [11.703726662s] [11.703726662s] END Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.338503 4891 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.344249 4891 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.344642 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.345109 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.345191 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.345213 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.345205 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:11 crc kubenswrapper[4891]: E0929 09:48:11.345298 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.345402 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.345707 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:11 crc kubenswrapper[4891]: E0929 09:48:11.345749 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:11 crc kubenswrapper[4891]: E0929 09:48:11.345982 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.347832 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.347953 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.347981 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.348104 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.348204 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.348919 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.349266 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.349383 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.350255 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.350615 4891 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.378771 4891 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.390625 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.406535 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.426159 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.438332 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.441875 4891 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.451204 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.451379 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.451412 4891 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.451432 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.451457 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.451482 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.451503 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.451527 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod 
\"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.451551 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.451566 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.451587 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.451602 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.451618 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.451707 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.451983 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.451983 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.451997 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.452146 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.452174 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.452223 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.452402 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.452566 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.452579 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.452611 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.452639 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: E0929 09:48:11.452714 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:48:11.951619361 +0000 UTC m=+22.156787672 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.453114 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.453165 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.453206 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.453450 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.453538 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.453759 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.453719 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.453850 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.453876 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454104 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454124 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454052 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454187 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454334 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454389 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454395 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454440 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454441 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454467 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454498 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454522 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454545 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454572 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454599 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454608 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454622 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454645 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454675 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454696 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454715 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454738 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454754 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454764 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454773 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454832 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454845 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454868 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454879 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454934 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454954 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454961 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.454981 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.455049 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.455046 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.455094 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.455114 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.455143 4891 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.455160 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.455173 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.456351 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.456356 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.456083 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.456464 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.456498 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.456517 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.456540 4891 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.456564 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.456586 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.456592 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.456604 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.456620 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.456627 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.456849 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.456915 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.456955 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.457010 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.457101 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.457235 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.457328 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.457493 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.457624 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.457772 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.457922 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.458059 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.458604 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.458674 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.458697 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.458730 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.458844 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.459680 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.460589 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.460678 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.460718 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.460750 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.460780 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.461327 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.461366 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.461392 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.461418 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.461444 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.461465 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.461485 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.461531 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 
09:48:11.461554 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.461576 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.461595 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.461615 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.461654 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.461688 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.461719 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.461748 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.461777 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.461828 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.461861 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.461892 4891 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.461901 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.461919 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.461950 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.461990 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462038 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462076 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462074 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462113 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462146 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462179 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462267 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462304 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462335 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462366 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462403 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462441 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462472 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462502 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462541 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462572 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462602 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462633 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462669 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462700 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" 
(UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462736 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462768 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462836 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462871 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.462912 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 09:48:11 crc 
kubenswrapper[4891]: I0929 09:48:11.462951 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463043 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463086 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463125 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463163 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463192 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463225 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463251 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463275 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463296 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463320 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 
09:48:11.463342 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463365 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463387 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463409 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463438 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463461 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463484 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463510 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463630 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463655 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463679 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463701 4891 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463725 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463753 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463782 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463818 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463840 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463871 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463894 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463922 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463947 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463968 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.463992 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464018 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464041 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464061 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464086 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464117 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 29 09:48:11 crc kubenswrapper[4891]: 
I0929 09:48:11.464141 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464165 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464188 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464211 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464241 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464279 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464311 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464335 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464362 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464389 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464383 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464419 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464450 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464478 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464505 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464537 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464576 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464599 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464618 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464639 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.464932 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.465283 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.465589 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.465434 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.465852 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.465900 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.466088 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.466496 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.466734 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.466756 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.467091 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.467561 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.467896 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.468106 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.468137 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.468154 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.468202 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.468238 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.468266 4891 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.468344 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.468499 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.468551 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.468555 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.468651 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.468687 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.468718 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.468745 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.468773 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.469472 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.469560 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.469599 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.469629 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.469661 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.469696 
4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.469711 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.469748 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.469779 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.469819 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.469841 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.469893 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.469927 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.469958 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.469991 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.470018 4891 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.470051 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.470084 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.470111 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.470140 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.470245 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.470348 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.470399 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.470440 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.470473 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.470504 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.470541 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.470572 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.470607 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.470668 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 
09:48:11.470700 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.470828 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.470883 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.470918 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.470963 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.471147 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.471173 4891 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.471187 4891 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.471201 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.471237 4891 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.471253 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.471267 4891 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc 
kubenswrapper[4891]: I0929 09:48:11.471283 4891 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.471297 4891 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.471312 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.471326 4891 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.471344 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.471359 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.471402 4891 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.471419 4891 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" 
(UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.471437 4891 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.470016 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.471159 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.471416 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.471959 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.472344 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.472515 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.473280 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.473573 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.473852 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.474217 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.475140 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.475296 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.478084 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.480577 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.488518 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.488909 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.489440 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-lfjwh"] Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.489900 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-gb8tp"] Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.490186 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.490554 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lfjwh" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.491783 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.491901 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.472468 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.491889 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.492090 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.492175 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.492309 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.492377 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.492523 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.492665 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.492814 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.492896 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.492971 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.493082 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.493163 4891 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.493242 4891 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc 
kubenswrapper[4891]: I0929 09:48:11.493391 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.493458 4891 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.493541 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.493604 4891 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.493668 4891 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.493737 4891 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.493831 4891 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.493918 4891 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.493999 4891 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494072 4891 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494146 4891 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494225 4891 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.492678 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.492856 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494328 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494303 4891 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.493027 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.493169 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.493202 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494405 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494433 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494451 4891 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494468 4891 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494482 4891 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 29 
09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494498 4891 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494514 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.493342 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494528 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494558 4891 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494573 4891 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494590 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494604 4891 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494619 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494634 4891 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494648 4891 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494670 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494701 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494716 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.493369 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 
09:48:11.494730 4891 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494881 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494901 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494920 4891 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494944 4891 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494964 4891 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494980 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494997 4891 reconciler_common.go:293] "Volume 
detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.495013 4891 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.495028 4891 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.495047 4891 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.495063 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.495077 4891 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.495092 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.495108 4891 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.495123 4891 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.495138 4891 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.493404 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.493448 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.493591 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.493650 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.493864 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.493891 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.493884 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.493324 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.495427 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494041 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.494182 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.496149 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.496504 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.496772 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.497449 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.497869 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.498095 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.498228 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.498439 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.498543 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.499432 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.499783 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.500036 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.500148 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.500561 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.500720 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.500840 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.502314 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.503140 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.503679 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.503961 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.504496 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.504679 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.504832 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.505298 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.505816 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.506257 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.506528 4891 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.507579 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.508300 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 09:48:11 crc kubenswrapper[4891]: E0929 09:48:11.508920 4891 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:48:11 crc kubenswrapper[4891]: E0929 09:48:11.509011 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:12.008982816 +0000 UTC m=+22.214151367 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:48:11 crc kubenswrapper[4891]: E0929 09:48:11.511523 4891 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:48:11 crc kubenswrapper[4891]: E0929 09:48:11.511674 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-09-29 09:48:12.011635736 +0000 UTC m=+22.216804217 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.514056 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.515274 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.515290 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.516573 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.518280 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.518449 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.519716 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.520463 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.520918 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.521405 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.521410 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.521882 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.522105 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.522119 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.522322 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.522636 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.522965 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.523245 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.523359 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.523646 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.524413 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.524493 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.524760 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.524996 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.525045 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.525265 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.527061 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.527396 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.528041 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.528922 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.529872 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.530593 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.532213 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.532723 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.533298 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.533591 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.534090 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.534508 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.534720 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: E0929 09:48:11.535023 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:48:11 crc kubenswrapper[4891]: E0929 09:48:11.535138 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:48:11 crc kubenswrapper[4891]: E0929 09:48:11.535221 4891 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:11 crc kubenswrapper[4891]: E0929 09:48:11.535371 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:12.035343979 +0000 UTC m=+22.240512490 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.535270 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.535658 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.535949 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.536413 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.536672 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.536947 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.537201 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.537205 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.537377 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.537528 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.538381 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.538693 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.538708 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.538855 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.538937 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.538986 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.539221 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.539243 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.539396 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.539831 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.539968 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.540318 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.541070 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.541445 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: E0929 09:48:11.542560 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:48:11 crc kubenswrapper[4891]: E0929 09:48:11.542586 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:48:11 crc kubenswrapper[4891]: E0929 09:48:11.542601 4891 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:11 crc kubenswrapper[4891]: E0929 09:48:11.542674 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-09-29 09:48:12.042649839 +0000 UTC m=+22.247818160 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.543817 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.544273 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.550449 4891 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33874->192.168.126.11:17697: read: connection reset by peer" start-of-body= Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.550533 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33874->192.168.126.11:17697: read: connection reset by peer" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.550951 4891 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.550981 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.551145 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.557096 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.567380 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.572088 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.585021 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.592046 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.595993 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/582de198-5a15-4c4c-aaea-881c638a42ac-proxy-tls\") pod \"machine-config-daemon-gb8tp\" (UID: \"582de198-5a15-4c4c-aaea-881c638a42ac\") " pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596040 4891 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/582de198-5a15-4c4c-aaea-881c638a42ac-rootfs\") pod \"machine-config-daemon-gb8tp\" (UID: \"582de198-5a15-4c4c-aaea-881c638a42ac\") " pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596071 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5jxw\" (UniqueName: \"kubernetes.io/projected/f4ba2043-c805-45e4-8a8c-aff311ac3ea5-kube-api-access-v5jxw\") pod \"node-resolver-lfjwh\" (UID: \"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\") " pod="openshift-dns/node-resolver-lfjwh" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596089 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f4ba2043-c805-45e4-8a8c-aff311ac3ea5-hosts-file\") pod \"node-resolver-lfjwh\" (UID: \"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\") " pod="openshift-dns/node-resolver-lfjwh" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596107 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596125 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/582de198-5a15-4c4c-aaea-881c638a42ac-mcd-auth-proxy-config\") pod \"machine-config-daemon-gb8tp\" (UID: \"582de198-5a15-4c4c-aaea-881c638a42ac\") " pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 09:48:11 crc 
kubenswrapper[4891]: I0929 09:48:11.596140 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk4j9\" (UniqueName: \"kubernetes.io/projected/582de198-5a15-4c4c-aaea-881c638a42ac-kube-api-access-zk4j9\") pod \"machine-config-daemon-gb8tp\" (UID: \"582de198-5a15-4c4c-aaea-881c638a42ac\") " pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596157 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596206 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596216 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596226 4891 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596234 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596242 4891 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596250 4891 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596259 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596267 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596276 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596284 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596292 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596301 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596309 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596319 4891 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596332 4891 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596341 4891 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596350 4891 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596358 4891 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596367 4891 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node 
\"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596377 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596385 4891 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596393 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596403 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596414 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596424 4891 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596431 4891 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 
09:48:11.596439 4891 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596447 4891 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596455 4891 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596463 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596472 4891 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596481 4891 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596489 4891 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596498 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" 
(UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596509 4891 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596517 4891 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596527 4891 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596536 4891 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596545 4891 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596553 4891 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596564 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on 
node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596580 4891 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596588 4891 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596597 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596605 4891 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596615 4891 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596624 4891 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596634 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596642 4891 reconciler_common.go:293] "Volume 
detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596650 4891 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596659 4891 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596667 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596675 4891 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596685 4891 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596694 4891 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596703 4891 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596712 4891 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596721 4891 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596729 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596737 4891 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596746 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596754 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596762 4891 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: 
I0929 09:48:11.596771 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596780 4891 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596829 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596838 4891 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596847 4891 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596855 4891 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596864 4891 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596874 4891 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596882 4891 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596891 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596900 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596909 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596917 4891 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596925 4891 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596934 4891 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596945 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596954 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596962 4891 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596972 4891 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596986 4891 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.596996 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597005 4891 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node 
\"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597014 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597023 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597032 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597040 4891 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597050 4891 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597059 4891 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597068 4891 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc 
kubenswrapper[4891]: I0929 09:48:11.597077 4891 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597085 4891 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597095 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597104 4891 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597115 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597127 4891 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597144 4891 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597155 4891 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597166 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597178 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597189 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597199 4891 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597210 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597220 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597231 4891 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597242 4891 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597253 4891 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597264 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597275 4891 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597285 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597294 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597302 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597311 4891 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597320 4891 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597329 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597337 4891 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597346 4891 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597354 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597363 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597371 4891 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597379 4891 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597443 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.597523 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.607113 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.615213 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.627261 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.638267 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.649198 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.661468 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.675283 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 09:48:11 crc kubenswrapper[4891]: W0929 09:48:11.676302 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-d911b43c085bc39eaa67ac1be168403da49fcb55cf2dff3493c4815ed9578dcb WatchSource:0}: Error finding container d911b43c085bc39eaa67ac1be168403da49fcb55cf2dff3493c4815ed9578dcb: Status 404 returned error can't find the container with id d911b43c085bc39eaa67ac1be168403da49fcb55cf2dff3493c4815ed9578dcb Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.683138 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 09:48:11 crc kubenswrapper[4891]: W0929 09:48:11.687769 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-d77e75e5631323e2d571d0e6df63dd88b240edc9c9f8bd6e3f9e4c34874f3809 WatchSource:0}: Error finding container d77e75e5631323e2d571d0e6df63dd88b240edc9c9f8bd6e3f9e4c34874f3809: Status 404 returned error can't find the container with id d77e75e5631323e2d571d0e6df63dd88b240edc9c9f8bd6e3f9e4c34874f3809 Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.698241 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5jxw\" (UniqueName: \"kubernetes.io/projected/f4ba2043-c805-45e4-8a8c-aff311ac3ea5-kube-api-access-v5jxw\") pod \"node-resolver-lfjwh\" (UID: \"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\") " pod="openshift-dns/node-resolver-lfjwh" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.698290 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f4ba2043-c805-45e4-8a8c-aff311ac3ea5-hosts-file\") pod \"node-resolver-lfjwh\" (UID: \"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\") " pod="openshift-dns/node-resolver-lfjwh" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.698309 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/582de198-5a15-4c4c-aaea-881c638a42ac-mcd-auth-proxy-config\") pod \"machine-config-daemon-gb8tp\" (UID: \"582de198-5a15-4c4c-aaea-881c638a42ac\") " pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.698324 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk4j9\" (UniqueName: 
\"kubernetes.io/projected/582de198-5a15-4c4c-aaea-881c638a42ac-kube-api-access-zk4j9\") pod \"machine-config-daemon-gb8tp\" (UID: \"582de198-5a15-4c4c-aaea-881c638a42ac\") " pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.698349 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/582de198-5a15-4c4c-aaea-881c638a42ac-proxy-tls\") pod \"machine-config-daemon-gb8tp\" (UID: \"582de198-5a15-4c4c-aaea-881c638a42ac\") " pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.698374 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/582de198-5a15-4c4c-aaea-881c638a42ac-rootfs\") pod \"machine-config-daemon-gb8tp\" (UID: \"582de198-5a15-4c4c-aaea-881c638a42ac\") " pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.698432 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/582de198-5a15-4c4c-aaea-881c638a42ac-rootfs\") pod \"machine-config-daemon-gb8tp\" (UID: \"582de198-5a15-4c4c-aaea-881c638a42ac\") " pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.698844 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f4ba2043-c805-45e4-8a8c-aff311ac3ea5-hosts-file\") pod \"node-resolver-lfjwh\" (UID: \"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\") " pod="openshift-dns/node-resolver-lfjwh" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.699760 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/582de198-5a15-4c4c-aaea-881c638a42ac-mcd-auth-proxy-config\") pod \"machine-config-daemon-gb8tp\" (UID: \"582de198-5a15-4c4c-aaea-881c638a42ac\") " pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.702713 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/582de198-5a15-4c4c-aaea-881c638a42ac-proxy-tls\") pod \"machine-config-daemon-gb8tp\" (UID: \"582de198-5a15-4c4c-aaea-881c638a42ac\") " pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 09:48:11 crc kubenswrapper[4891]: W0929 09:48:11.708949 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-8f1a7fb598fc5bbc8c98b9cec4ae9215e992e07a79cda8d4d2be68b8840ddbad WatchSource:0}: Error finding container 8f1a7fb598fc5bbc8c98b9cec4ae9215e992e07a79cda8d4d2be68b8840ddbad: Status 404 returned error can't find the container with id 8f1a7fb598fc5bbc8c98b9cec4ae9215e992e07a79cda8d4d2be68b8840ddbad Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.713341 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk4j9\" (UniqueName: \"kubernetes.io/projected/582de198-5a15-4c4c-aaea-881c638a42ac-kube-api-access-zk4j9\") pod \"machine-config-daemon-gb8tp\" (UID: \"582de198-5a15-4c4c-aaea-881c638a42ac\") " pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.716251 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5jxw\" (UniqueName: \"kubernetes.io/projected/f4ba2043-c805-45e4-8a8c-aff311ac3ea5-kube-api-access-v5jxw\") pod \"node-resolver-lfjwh\" (UID: \"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\") " pod="openshift-dns/node-resolver-lfjwh" Sep 29 09:48:11 crc 
kubenswrapper[4891]: I0929 09:48:11.815703 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.834843 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lfjwh" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.850953 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-ngmm4"] Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.853080 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ngmm4" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.853082 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fs6qf"] Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.854917 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.856668 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-5fhhd"] Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.858211 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.858218 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.858431 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.858745 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.858872 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.859990 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.860039 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.860818 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.860961 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.861016 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.861148 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.861214 4891 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.861254 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.861361 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.864205 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.870173 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.884000 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:11 crc kubenswrapper[4891]: W0929 09:48:11.890170 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4ba2043_c805_45e4_8a8c_aff311ac3ea5.slice/crio-b2b12163da0b00f5317d0a5cb7794c2c976b8742126a6dbe2598ced1960be0d6 
WatchSource:0}: Error finding container b2b12163da0b00f5317d0a5cb7794c2c976b8742126a6dbe2598ced1960be0d6: Status 404 returned error can't find the container with id b2b12163da0b00f5317d0a5cb7794c2c976b8742126a6dbe2598ced1960be0d6 Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.909147 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.924344 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.949197 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:11 crc kubenswrapper[4891]: I0929 09:48:11.975116 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.001149 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.001263 4891 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-os-release\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.001298 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-run-ovn-kubernetes\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.001323 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e-cnibin\") pod \"multus-additional-cni-plugins-5fhhd\" (UID: \"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\") " pod="openshift-multus/multus-additional-cni-plugins-5fhhd" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.001344 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-cnibin\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.001363 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-var-lib-openvswitch\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.001381 4891 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e-cni-binary-copy\") pod \"multus-additional-cni-plugins-5fhhd\" (UID: \"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\") " pod="openshift-multus/multus-additional-cni-plugins-5fhhd" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.001402 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e-system-cni-dir\") pod \"multus-additional-cni-plugins-5fhhd\" (UID: \"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\") " pod="openshift-multus/multus-additional-cni-plugins-5fhhd" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.001422 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-multus-cni-dir\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.001442 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-hostroot\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.001459 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-multus-conf-dir\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.001481 4891 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-slash\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.001502 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-etc-openvswitch\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.001521 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-node-log\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.001561 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-host-var-lib-kubelet\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.001598 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-host-run-multus-certs\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.001624 4891 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/01bb1c54-d2f0-498e-ad60-8216c29b843d-ovn-node-metrics-cert\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.001658 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-host-run-netns\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: E0929 09:48:12.001836 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:48:13.001768498 +0000 UTC m=+23.206936819 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002002 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-run-systemd\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002031 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jbx5\" (UniqueName: \"kubernetes.io/projected/01bb1c54-d2f0-498e-ad60-8216c29b843d-kube-api-access-4jbx5\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002049 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-run-netns\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002065 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-run-openvswitch\") pod \"ovnkube-node-fs6qf\" (UID: 
\"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002084 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-cni-bin\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002100 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/01bb1c54-d2f0-498e-ad60-8216c29b843d-ovnkube-config\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002114 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5fhhd\" (UID: \"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\") " pod="openshift-multus/multus-additional-cni-plugins-5fhhd" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002132 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-system-cni-dir\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002147 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-host-run-k8s-cni-cncf-io\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002161 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-log-socket\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002176 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bfce090-366c-43be-ab12-d291b4d25217-cni-binary-copy\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002191 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-multus-socket-dir-parent\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002214 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4994w\" (UniqueName: \"kubernetes.io/projected/4bfce090-366c-43be-ab12-d291b4d25217-kube-api-access-4994w\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002230 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-kubelet\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002247 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-systemd-units\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002262 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e-os-release\") pod \"multus-additional-cni-plugins-5fhhd\" (UID: \"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\") " pod="openshift-multus/multus-additional-cni-plugins-5fhhd" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002291 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-host-var-lib-cni-bin\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002311 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5fhhd\" (UID: \"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\") " pod="openshift-multus/multus-additional-cni-plugins-5fhhd" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002326 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wscb\" (UniqueName: \"kubernetes.io/projected/d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e-kube-api-access-7wscb\") pod \"multus-additional-cni-plugins-5fhhd\" (UID: \"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\") " pod="openshift-multus/multus-additional-cni-plugins-5fhhd" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002354 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-etc-kubernetes\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002379 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-run-ovn\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002394 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-cni-netd\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002411 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc 
kubenswrapper[4891]: I0929 09:48:12.002431 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/01bb1c54-d2f0-498e-ad60-8216c29b843d-ovnkube-script-lib\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002451 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-host-var-lib-cni-multus\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002478 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bfce090-366c-43be-ab12-d291b4d25217-multus-daemon-config\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.002513 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/01bb1c54-d2f0-498e-ad60-8216c29b843d-env-overrides\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.007179 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.039654 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.071879 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.088460 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.100547 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103130 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e-os-release\") pod \"multus-additional-cni-plugins-5fhhd\" (UID: \"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\") " pod="openshift-multus/multus-additional-cni-plugins-5fhhd" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103195 4891 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-host-var-lib-cni-bin\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103223 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5fhhd\" (UID: \"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\") " pod="openshift-multus/multus-additional-cni-plugins-5fhhd" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103253 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wscb\" (UniqueName: \"kubernetes.io/projected/d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e-kube-api-access-7wscb\") pod \"multus-additional-cni-plugins-5fhhd\" (UID: \"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\") " pod="openshift-multus/multus-additional-cni-plugins-5fhhd" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103298 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103324 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-etc-kubernetes\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: 
I0929 09:48:12.103348 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-run-ovn\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103372 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-cni-netd\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103395 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103395 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-host-var-lib-cni-bin\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103439 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/01bb1c54-d2f0-498e-ad60-8216c29b843d-ovnkube-script-lib\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103496 4891 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bfce090-366c-43be-ab12-d291b4d25217-multus-daemon-config\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103525 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-host-var-lib-cni-multus\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103562 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103587 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/01bb1c54-d2f0-498e-ad60-8216c29b843d-env-overrides\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103604 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-os-release\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103625 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-run-ovn-kubernetes\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103649 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e-cnibin\") pod \"multus-additional-cni-plugins-5fhhd\" (UID: \"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\") " pod="openshift-multus/multus-additional-cni-plugins-5fhhd" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103680 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-cnibin\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103701 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-var-lib-openvswitch\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103719 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e-cni-binary-copy\") pod \"multus-additional-cni-plugins-5fhhd\" (UID: \"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\") " pod="openshift-multus/multus-additional-cni-plugins-5fhhd" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103743 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e-system-cni-dir\") pod \"multus-additional-cni-plugins-5fhhd\" (UID: \"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\") " pod="openshift-multus/multus-additional-cni-plugins-5fhhd" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103762 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-multus-cni-dir\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103776 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-hostroot\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103809 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-multus-conf-dir\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103828 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-slash\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103842 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-etc-openvswitch\") pod \"ovnkube-node-fs6qf\" (UID: 
\"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103858 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-node-log\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103881 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103898 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-host-var-lib-kubelet\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103899 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e-cnibin\") pod \"multus-additional-cni-plugins-5fhhd\" (UID: \"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\") " pod="openshift-multus/multus-additional-cni-plugins-5fhhd" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103937 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-host-run-multus-certs\") pod \"multus-ngmm4\" (UID: 
\"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103914 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-host-run-multus-certs\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103965 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/01bb1c54-d2f0-498e-ad60-8216c29b843d-ovn-node-metrics-cert\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103988 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.103999 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e-os-release\") pod \"multus-additional-cni-plugins-5fhhd\" (UID: \"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\") " pod="openshift-multus/multus-additional-cni-plugins-5fhhd" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.104031 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-host-run-netns\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " 
pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.104041 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-cnibin\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.104068 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-var-lib-openvswitch\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.104130 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-etc-kubernetes\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.104403 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/01bb1c54-d2f0-498e-ad60-8216c29b843d-ovnkube-script-lib\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.104475 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-run-ovn\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.104508 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-cni-netd\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.104532 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.104814 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bfce090-366c-43be-ab12-d291b4d25217-multus-daemon-config\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.104822 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e-cni-binary-copy\") pod \"multus-additional-cni-plugins-5fhhd\" (UID: \"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\") " pod="openshift-multus/multus-additional-cni-plugins-5fhhd" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.104862 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e-system-cni-dir\") pod \"multus-additional-cni-plugins-5fhhd\" (UID: \"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\") " pod="openshift-multus/multus-additional-cni-plugins-5fhhd" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.104876 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-os-release\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.104903 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-run-ovn-kubernetes\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.104006 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-host-run-netns\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.104930 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-run-systemd\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.104946 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jbx5\" (UniqueName: \"kubernetes.io/projected/01bb1c54-d2f0-498e-ad60-8216c29b843d-kube-api-access-4jbx5\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.104951 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-host-var-lib-cni-multus\") 
pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.104993 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-run-systemd\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.104980 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-run-netns\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105074 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5fhhd\" (UID: \"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\") " pod="openshift-multus/multus-additional-cni-plugins-5fhhd" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105110 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/01bb1c54-d2f0-498e-ad60-8216c29b843d-env-overrides\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105148 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-hostroot\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 
09:48:12 crc kubenswrapper[4891]: E0929 09:48:12.105155 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:48:12 crc kubenswrapper[4891]: E0929 09:48:12.105181 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:48:12 crc kubenswrapper[4891]: E0929 09:48:12.105196 4891 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105193 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-node-log\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: E0929 09:48:12.105220 4891 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105240 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-multus-conf-dir\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: E0929 09:48:12.105258 4891 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:13.10523344 +0000 UTC m=+23.310401961 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105278 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-slash\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: E0929 09:48:12.105292 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:13.105279992 +0000 UTC m=+23.310448513 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.104964 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-run-netns\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105320 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-etc-openvswitch\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: E0929 09:48:12.105343 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105353 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-run-openvswitch\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105269 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-host-var-lib-kubelet\") 
pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105376 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-cni-bin\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: E0929 09:48:12.105357 4891 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105399 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/01bb1c54-d2f0-498e-ad60-8216c29b843d-ovnkube-config\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: E0929 09:48:12.105425 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:13.105407276 +0000 UTC m=+23.310575597 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105408 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-run-openvswitch\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105440 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-cni-bin\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105447 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5fhhd\" (UID: \"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\") " pod="openshift-multus/multus-additional-cni-plugins-5fhhd" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105502 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-system-cni-dir\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105531 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-host-run-k8s-cni-cncf-io\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105560 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-log-socket\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105587 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bfce090-366c-43be-ab12-d291b4d25217-cni-binary-copy\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105610 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-multus-socket-dir-parent\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105636 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4994w\" (UniqueName: \"kubernetes.io/projected/4bfce090-366c-43be-ab12-d291b4d25217-kube-api-access-4994w\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105658 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-kubelet\") 
pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105680 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-systemd-units\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105755 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-systemd-units\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105129 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-multus-cni-dir\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105824 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-system-cni-dir\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.105913 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-multus-socket-dir-parent\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc 
kubenswrapper[4891]: E0929 09:48:12.105373 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:48:12 crc kubenswrapper[4891]: E0929 09:48:12.106016 4891 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:12 crc kubenswrapper[4891]: E0929 09:48:12.106077 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:13.106064105 +0000 UTC m=+23.311232616 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.106078 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-kubelet\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.106094 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-log-socket\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.106099 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4bfce090-366c-43be-ab12-d291b4d25217-host-run-k8s-cni-cncf-io\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.106379 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/01bb1c54-d2f0-498e-ad60-8216c29b843d-ovnkube-config\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.106498 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bfce090-366c-43be-ab12-d291b4d25217-cni-binary-copy\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.109971 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/01bb1c54-d2f0-498e-ad60-8216c29b843d-ovn-node-metrics-cert\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.112692 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.121911 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wscb\" (UniqueName: \"kubernetes.io/projected/d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e-kube-api-access-7wscb\") pod \"multus-additional-cni-plugins-5fhhd\" (UID: \"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\") " pod="openshift-multus/multus-additional-cni-plugins-5fhhd" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.123381 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jbx5\" (UniqueName: \"kubernetes.io/projected/01bb1c54-d2f0-498e-ad60-8216c29b843d-kube-api-access-4jbx5\") pod \"ovnkube-node-fs6qf\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.124268 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4994w\" (UniqueName: \"kubernetes.io/projected/4bfce090-366c-43be-ab12-d291b4d25217-kube-api-access-4994w\") pod \"multus-ngmm4\" (UID: \"4bfce090-366c-43be-ab12-d291b4d25217\") " pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.124832 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.135035 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.146885 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.158666 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.177575 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.187946 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.194289 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ngmm4" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.201658 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.212766 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.219088 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.243634 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5fhhd\" (UID: \"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\") " pod="openshift-multus/multus-additional-cni-plugins-5fhhd" Sep 29 09:48:12 crc kubenswrapper[4891]: W0929 09:48:12.326086 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01bb1c54_d2f0_498e_ad60_8216c29b843d.slice/crio-5a3c4615a682515d0b2ce56502f35bd34b59275f7787593663d716d848591f3a WatchSource:0}: Error finding container 5a3c4615a682515d0b2ce56502f35bd34b59275f7787593663d716d848591f3a: Status 404 returned error can't find the container with id 5a3c4615a682515d0b2ce56502f35bd34b59275f7787593663d716d848591f3a Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.400281 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.401166 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.402460 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.403347 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" 
path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.404419 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.404954 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.405586 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.406731 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.407439 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.408679 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.409326 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.410633 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.411334 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.412057 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.413661 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.414500 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.416295 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.416843 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.417582 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.419012 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.419668 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.420381 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.421499 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.422544 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.423091 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.424269 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.424987 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.426198 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.426963 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.427940 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.428406 4891 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.428511 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.431108 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.431745 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.432413 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.434742 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.436179 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.436884 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.438135 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.438846 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.439776 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.440401 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.441428 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.442123 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.443073 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.443951 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.445080 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.446169 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.447729 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.448915 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.449520 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.450818 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.451997 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.452893 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.506180 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.537178 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerStarted","Data":"91afd0d56169c1f360c57ceb97957bc48e79615ded802e7f78b8bcb6939d55b3"} Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.538627 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerStarted","Data":"3a1a2499369a7130c41c8cc90b7e89219a0a4f01cf39514ab66001b0f8737c29"} Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.540271 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.540911 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 29 09:48:12 
crc kubenswrapper[4891]: I0929 09:48:12.544972 4891 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75" exitCode=255 Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.545078 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75"} Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.545138 4891 scope.go:117] "RemoveContainer" containerID="57166ed9b36d58a2de9086705114aad0a1d198dd0c0a50352400795613a2899f" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.547500 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d77e75e5631323e2d571d0e6df63dd88b240edc9c9f8bd6e3f9e4c34874f3809"} Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.550805 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerStarted","Data":"a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a"} Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.551160 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerStarted","Data":"5a3c4615a682515d0b2ce56502f35bd34b59275f7787593663d716d848591f3a"} Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.551661 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" 
event={"ID":"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e","Type":"ContainerStarted","Data":"701e8e61e32ed00250bbcd6781a484027e09c856a4bb979a0dececb14f672ae8"} Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.552336 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ngmm4" event={"ID":"4bfce090-366c-43be-ab12-d291b4d25217","Type":"ContainerStarted","Data":"fdd8e5a9417833ba84fc6eb7803f09075f2eed64998aafea04cf650f3d954d52"} Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.553878 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lfjwh" event={"ID":"f4ba2043-c805-45e4-8a8c-aff311ac3ea5","Type":"ContainerStarted","Data":"83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b"} Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.553909 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lfjwh" event={"ID":"f4ba2043-c805-45e4-8a8c-aff311ac3ea5","Type":"ContainerStarted","Data":"b2b12163da0b00f5317d0a5cb7794c2c976b8742126a6dbe2598ced1960be0d6"} Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.558139 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613"} Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.558195 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d911b43c085bc39eaa67ac1be168403da49fcb55cf2dff3493c4815ed9578dcb"} Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.558429 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.566291 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2"} Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.566351 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8f1a7fb598fc5bbc8c98b9cec4ae9215e992e07a79cda8d4d2be68b8840ddbad"} Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.573419 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.590267 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.621343 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.643773 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.660654 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.662859 4891 scope.go:117] "RemoveContainer" containerID="afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.662947 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Sep 29 09:48:12 crc kubenswrapper[4891]: E0929 09:48:12.663075 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.678195 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.694706 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.712921 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.724633 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.738507 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.756647 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.776677 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.798115 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.825833 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.834929 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.841887 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.855241 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.857179 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.879357 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.908293 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.924774 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57166ed9b36d58a2de9086705114aad0a1d198dd0c0a50352400795613a2899f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:47:55Z\\\",\\\"message\\\":\\\"W0929 09:47:54.561237 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0929 09:47:54.561613 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759139274 cert, and key in /tmp/serving-cert-2571413895/serving-signer.crt, /tmp/serving-cert-2571413895/serving-signer.key\\\\nI0929 09:47:54.901282 1 observer_polling.go:159] Starting file observer\\\\nW0929 09:47:54.914349 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0929 09:47:54.914516 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:47:54.915155 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2571413895/tls.crt::/tmp/serving-cert-2571413895/tls.key\\\\\\\"\\\\nF0929 09:47:55.280758 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 
maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.936458 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.948434 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.958695 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.974988 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:12 crc kubenswrapper[4891]: I0929 09:48:12.983896 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"host
s-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.001544 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.014973 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57166ed9b36d58a2de9086705114aad0a1d198dd0c0a50352400795613a2899f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:47:55Z\\\",\\\"message\\\":\\\"W0929 09:47:54.561237 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0929 09:47:54.561613 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759139274 cert, and key in /tmp/serving-cert-2571413895/serving-signer.crt, /tmp/serving-cert-2571413895/serving-signer.key\\\\nI0929 09:47:54.901282 1 observer_polling.go:159] Starting file observer\\\\nW0929 09:47:54.914349 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0929 09:47:54.914516 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:47:54.915155 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2571413895/tls.crt::/tmp/serving-cert-2571413895/tls.key\\\\\\\"\\\\nF0929 09:47:55.280758 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 
maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.022281 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:48:13 crc kubenswrapper[4891]: E0929 09:48:13.022563 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:48:15.022530612 +0000 UTC m=+25.227698933 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.027297 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.037516 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.050111 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.064965 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.079032 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.091662 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.103710 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.123375 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.123436 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.123497 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") 
pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.123525 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:13 crc kubenswrapper[4891]: E0929 09:48:13.123628 4891 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:48:13 crc kubenswrapper[4891]: E0929 09:48:13.123679 4891 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:48:13 crc kubenswrapper[4891]: E0929 09:48:13.123744 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:48:13 crc kubenswrapper[4891]: E0929 09:48:13.123767 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:48:13 crc kubenswrapper[4891]: E0929 09:48:13.123688 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:48:13 crc kubenswrapper[4891]: E0929 09:48:13.123803 4891 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:13 crc kubenswrapper[4891]: E0929 09:48:13.123817 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:48:13 crc kubenswrapper[4891]: E0929 09:48:13.123834 4891 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:13 crc kubenswrapper[4891]: E0929 09:48:13.123744 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:15.123721725 +0000 UTC m=+25.328890046 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:48:13 crc kubenswrapper[4891]: E0929 09:48:13.123890 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:15.123861249 +0000 UTC m=+25.329029570 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:13 crc kubenswrapper[4891]: E0929 09:48:13.123918 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:15.123911111 +0000 UTC m=+25.329079432 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:48:13 crc kubenswrapper[4891]: E0929 09:48:13.123934 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:15.123926391 +0000 UTC m=+25.329094712 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.137269 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert
-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.178086 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.214827 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.395710 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.395768 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:13 crc kubenswrapper[4891]: E0929 09:48:13.395933 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:13 crc kubenswrapper[4891]: E0929 09:48:13.396120 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.396369 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:13 crc kubenswrapper[4891]: E0929 09:48:13.396586 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.570646 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" event={"ID":"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e","Type":"ContainerStarted","Data":"99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558"} Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.572173 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ngmm4" event={"ID":"4bfce090-366c-43be-ab12-d291b4d25217","Type":"ContainerStarted","Data":"0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c"} Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.574725 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerStarted","Data":"d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309"} Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.576584 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.579488 4891 scope.go:117] "RemoveContainer" containerID="afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75" Sep 29 09:48:13 crc 
kubenswrapper[4891]: E0929 09:48:13.579671 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.580670 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546"} Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.582973 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.583101 4891 generic.go:334] "Generic (PLEG): container finished" podID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerID="a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a" exitCode=0 Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.583154 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerDied","Data":"a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a"} Sep 29 09:48:13 crc kubenswrapper[4891]: E0929 09:48:13.590919 4891 
kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.598816 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.616007 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d
742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.629526 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.643502 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.658660 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.673160 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.684592 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.696173 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57166ed9b36d58a2de9086705114aad0a1d198dd0c0a50352400795613a2899f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:47:55Z\\\",\\\"message\\\":\\\"W0929 09:47:54.561237 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0929 09:47:54.561613 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759139274 cert, and key in /tmp/serving-cert-2571413895/serving-signer.crt, /tmp/serving-cert-2571413895/serving-signer.key\\\\nI0929 09:47:54.901282 1 observer_polling.go:159] Starting file observer\\\\nW0929 09:47:54.914349 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0929 09:47:54.914516 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:47:54.915155 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2571413895/tls.crt::/tmp/serving-cert-2571413895/tls.key\\\\\\\"\\\\nF0929 09:47:55.280758 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 
maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.705349 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.718159 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.730073 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.762831 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.798317 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.838403 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.881840 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.915261 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.956128 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:13 crc kubenswrapper[4891]: I0929 09:48:13.998375 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.033554 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.075775 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.113767 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.165381 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.199162 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d
742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.239159 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.278923 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.328887 4891 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.590911 4891 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerStarted","Data":"2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535"} Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.590973 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerStarted","Data":"7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2"} Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.590986 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerStarted","Data":"0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d"} Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.590996 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerStarted","Data":"a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598"} Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.591018 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerStarted","Data":"7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad"} Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.592500 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9"} Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.594143 4891 generic.go:334] "Generic (PLEG): container finished" podID="d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e" 
containerID="99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558" exitCode=0 Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.594217 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" event={"ID":"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e","Type":"ContainerDied","Data":"99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558"} Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.595847 4891 scope.go:117] "RemoveContainer" containerID="afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75" Sep 29 09:48:14 crc kubenswrapper[4891]: E0929 09:48:14.596026 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.617019 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:14Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.630908 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:14Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.649189 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:14Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.663219 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:14Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.678118 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:14Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.698924 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:14Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.714771 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:14Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.729410 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:14Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.740570 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:14Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.758941 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:14Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.773826 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d
742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:14Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.787907 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:14Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.801458 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:14Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.839812 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:14Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.879900 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:14Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.926187 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:14Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:14 crc kubenswrapper[4891]: I0929 09:48:14.961780 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:14Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.000926 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:14Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.046643 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:48:15 crc kubenswrapper[4891]: E0929 09:48:15.046962 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:48:19.046922142 +0000 UTC m=+29.252090453 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.048223 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.080656 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.120350 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.148072 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.148148 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.148184 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.148209 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:15 crc kubenswrapper[4891]: E0929 09:48:15.148284 4891 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:48:15 crc kubenswrapper[4891]: E0929 09:48:15.148365 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:48:15 crc kubenswrapper[4891]: E0929 09:48:15.148386 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:19.148358923 +0000 UTC m=+29.353527244 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:48:15 crc kubenswrapper[4891]: E0929 09:48:15.148393 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:48:15 crc kubenswrapper[4891]: E0929 09:48:15.148388 4891 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:48:15 crc kubenswrapper[4891]: E0929 09:48:15.148445 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:48:15 crc kubenswrapper[4891]: E0929 09:48:15.148471 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:48:15 crc kubenswrapper[4891]: E0929 09:48:15.148491 4891 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:15 crc kubenswrapper[4891]: E0929 09:48:15.148518 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-09-29 09:48:19.148490307 +0000 UTC m=+29.353658818 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:48:15 crc kubenswrapper[4891]: E0929 09:48:15.148413 4891 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:15 crc kubenswrapper[4891]: E0929 09:48:15.148537 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:19.148528968 +0000 UTC m=+29.353697479 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:15 crc kubenswrapper[4891]: E0929 09:48:15.148567 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:19.148555909 +0000 UTC m=+29.353724420 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.168839 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.199479 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.243254 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.292803 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.329601 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.395647 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.395724 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.395728 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:15 crc kubenswrapper[4891]: E0929 09:48:15.395855 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:15 crc kubenswrapper[4891]: E0929 09:48:15.396011 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:15 crc kubenswrapper[4891]: E0929 09:48:15.396097 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.600752 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" event={"ID":"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e","Type":"ContainerStarted","Data":"39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13"} Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.605550 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerStarted","Data":"9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa"} Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.615476 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.644066 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.663334 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.679252 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.695172 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.707990 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.723474 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.738643 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.752945 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.769597 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.783998 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.800313 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:15 crc kubenswrapper[4891]: I0929 09:48:15.838706 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:16 crc kubenswrapper[4891]: I0929 09:48:16.612302 4891 generic.go:334] "Generic (PLEG): container finished" podID="d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e" containerID="39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13" exitCode=0 Sep 29 09:48:16 crc kubenswrapper[4891]: I0929 09:48:16.612357 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" 
event={"ID":"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e","Type":"ContainerDied","Data":"39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13"} Sep 29 09:48:16 crc kubenswrapper[4891]: I0929 09:48:16.631156 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acce
ss-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:16Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:16 crc kubenswrapper[4891]: I0929 09:48:16.647934 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:16Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:16 crc kubenswrapper[4891]: I0929 09:48:16.660532 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:16Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:16 crc kubenswrapper[4891]: I0929 09:48:16.674317 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:16Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:16 crc kubenswrapper[4891]: I0929 09:48:16.689408 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:16Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:16 crc kubenswrapper[4891]: I0929 09:48:16.739749 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:16Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:16 crc kubenswrapper[4891]: I0929 09:48:16.750264 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:16Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:16 crc kubenswrapper[4891]: I0929 09:48:16.769126 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:16Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:16 crc kubenswrapper[4891]: I0929 09:48:16.784955 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:16Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:16 crc kubenswrapper[4891]: I0929 09:48:16.801828 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:16Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:16 crc kubenswrapper[4891]: I0929 09:48:16.817445 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:16Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:16 crc kubenswrapper[4891]: I0929 09:48:16.832555 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:16Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:16 crc kubenswrapper[4891]: I0929 09:48:16.848382 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:16Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.395140 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.395178 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:17 crc kubenswrapper[4891]: E0929 09:48:17.395673 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:17 crc kubenswrapper[4891]: E0929 09:48:17.395865 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.395212 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:17 crc kubenswrapper[4891]: E0929 09:48:17.395980 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.622633 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerStarted","Data":"9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f"} Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.625434 4891 generic.go:334] "Generic (PLEG): container finished" podID="d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e" containerID="6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184" exitCode=0 Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.625491 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" event={"ID":"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e","Type":"ContainerDied","Data":"6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184"} Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.640174 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:48:17Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.656055 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:17Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.670536 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:17Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.689848 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:17Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.705717 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:17Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.718497 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:17Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.732234 4891 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.732737 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:17Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.734712 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:17 crc 
kubenswrapper[4891]: I0929 09:48:17.734744 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.734757 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.734967 4891 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.742064 4891 kubelet_node_status.go:115] "Node was previously registered" node="crc" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.742527 4891 kubelet_node_status.go:79] "Successfully registered node" node="crc" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.743526 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.743553 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.743563 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.743581 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.743593 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:17Z","lastTransitionTime":"2025-09-29T09:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.749287 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:17Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:17 crc kubenswrapper[4891]: E0929 
09:48:17.759657 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:17Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.763299 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.763333 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.763341 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.763361 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.763372 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:17Z","lastTransitionTime":"2025-09-29T09:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.765352 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:17Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:17 crc kubenswrapper[4891]: E0929 09:48:17.777301 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:17Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.779641 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:17Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.781420 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.781451 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.781462 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 
09:48:17.781480 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.781492 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:17Z","lastTransitionTime":"2025-09-29T09:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.793232 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615ded802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:17Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:17 crc kubenswrapper[4891]: E0929 09:48:17.793273 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2
d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:17Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.797194 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.797240 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.797252 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.797273 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.797313 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:17Z","lastTransitionTime":"2025-09-29T09:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.807634 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b
4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:17Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:17 crc kubenswrapper[4891]: E0929 09:48:17.810294 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:17Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.819113 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.819192 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.819205 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.819225 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.819241 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:17Z","lastTransitionTime":"2025-09-29T09:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.825319 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:17Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:17 crc kubenswrapper[4891]: E0929 09:48:17.834390 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:17Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:17 crc kubenswrapper[4891]: E0929 09:48:17.834555 4891 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.836335 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.836386 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.836399 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.836421 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.836436 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:17Z","lastTransitionTime":"2025-09-29T09:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.939891 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.939926 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.939935 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.939951 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:17 crc kubenswrapper[4891]: I0929 09:48:17.939961 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:17Z","lastTransitionTime":"2025-09-29T09:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.043030 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.043075 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.043087 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.043107 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.043120 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:18Z","lastTransitionTime":"2025-09-29T09:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.103882 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-5nmcv"] Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.104375 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-5nmcv" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.108830 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.109116 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.109518 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.109587 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.120836 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.134465 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.146310 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.146648 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.146740 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.146880 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.147020 4891 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:18Z","lastTransitionTime":"2025-09-29T09:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.147382 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615ded802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.158045 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.174556 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.188530 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.204780 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.219251 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.244494 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.249446 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.249505 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.249519 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.249547 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.249563 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:18Z","lastTransitionTime":"2025-09-29T09:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.262009 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.275880 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.279364 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/143386d4-de10-4bb9-b79e-eaf04f8247ce-host\") pod \"node-ca-5nmcv\" (UID: \"143386d4-de10-4bb9-b79e-eaf04f8247ce\") " pod="openshift-image-registry/node-ca-5nmcv" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.279407 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/143386d4-de10-4bb9-b79e-eaf04f8247ce-serviceca\") pod \"node-ca-5nmcv\" (UID: \"143386d4-de10-4bb9-b79e-eaf04f8247ce\") " pod="openshift-image-registry/node-ca-5nmcv" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.279432 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzs9n\" (UniqueName: \"kubernetes.io/projected/143386d4-de10-4bb9-b79e-eaf04f8247ce-kube-api-access-rzs9n\") pod \"node-ca-5nmcv\" (UID: \"143386d4-de10-4bb9-b79e-eaf04f8247ce\") " pod="openshift-image-registry/node-ca-5nmcv" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.292696 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.309968 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.324620 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.352098 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.352142 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.352152 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 
09:48:18.352173 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.352183 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:18Z","lastTransitionTime":"2025-09-29T09:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.380858 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/143386d4-de10-4bb9-b79e-eaf04f8247ce-host\") pod \"node-ca-5nmcv\" (UID: \"143386d4-de10-4bb9-b79e-eaf04f8247ce\") " pod="openshift-image-registry/node-ca-5nmcv" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.381198 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/143386d4-de10-4bb9-b79e-eaf04f8247ce-serviceca\") pod \"node-ca-5nmcv\" (UID: \"143386d4-de10-4bb9-b79e-eaf04f8247ce\") " pod="openshift-image-registry/node-ca-5nmcv" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.381304 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzs9n\" (UniqueName: \"kubernetes.io/projected/143386d4-de10-4bb9-b79e-eaf04f8247ce-kube-api-access-rzs9n\") pod \"node-ca-5nmcv\" (UID: \"143386d4-de10-4bb9-b79e-eaf04f8247ce\") " pod="openshift-image-registry/node-ca-5nmcv" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.381046 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/143386d4-de10-4bb9-b79e-eaf04f8247ce-host\") pod \"node-ca-5nmcv\" (UID: 
\"143386d4-de10-4bb9-b79e-eaf04f8247ce\") " pod="openshift-image-registry/node-ca-5nmcv" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.382436 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/143386d4-de10-4bb9-b79e-eaf04f8247ce-serviceca\") pod \"node-ca-5nmcv\" (UID: \"143386d4-de10-4bb9-b79e-eaf04f8247ce\") " pod="openshift-image-registry/node-ca-5nmcv" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.456728 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.456813 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.456826 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.456847 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.456860 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:18Z","lastTransitionTime":"2025-09-29T09:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.470233 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzs9n\" (UniqueName: \"kubernetes.io/projected/143386d4-de10-4bb9-b79e-eaf04f8247ce-kube-api-access-rzs9n\") pod \"node-ca-5nmcv\" (UID: \"143386d4-de10-4bb9-b79e-eaf04f8247ce\") " pod="openshift-image-registry/node-ca-5nmcv" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.560201 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.560245 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.560256 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.560276 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.560288 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:18Z","lastTransitionTime":"2025-09-29T09:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.631544 4891 generic.go:334] "Generic (PLEG): container finished" podID="d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e" containerID="88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae" exitCode=0 Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.631591 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" event={"ID":"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e","Type":"ContainerDied","Data":"88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae"} Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.643870 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.658471 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.662173 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:18 crc 
kubenswrapper[4891]: I0929 09:48:18.662218 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.662229 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.662247 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.662265 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:18Z","lastTransitionTime":"2025-09-29T09:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.679977 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.694234 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.710273 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.716553 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-5nmcv" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.728237 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"contai
nerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-contr
oller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: W0929 09:48:18.738728 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod143386d4_de10_4bb9_b79e_eaf04f8247ce.slice/crio-79b022f726d0576ce451a7c8a0a8c5c22d5c2cf521bb55a4836e3b4fb063aef4 WatchSource:0}: Error finding container 79b022f726d0576ce451a7c8a0a8c5c22d5c2cf521bb55a4836e3b4fb063aef4: Status 404 returned error can't find the container with id 79b022f726d0576ce451a7c8a0a8c5c22d5c2cf521bb55a4836e3b4fb063aef4 Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.746190 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.758895 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.765340 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.765377 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.765391 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:18 crc 
kubenswrapper[4891]: I0929 09:48:18.765409 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.765424 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:18Z","lastTransitionTime":"2025-09-29T09:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.767625 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.783064 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.799614 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.812625 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.827121 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.850410 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:18Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.868546 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.868594 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.868604 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.868625 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.868637 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:18Z","lastTransitionTime":"2025-09-29T09:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.972712 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.972826 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.972842 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.972876 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:18 crc kubenswrapper[4891]: I0929 09:48:18.972895 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:18Z","lastTransitionTime":"2025-09-29T09:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.075940 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.076013 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.076026 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.076048 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.076083 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:19Z","lastTransitionTime":"2025-09-29T09:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.090913 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:48:19 crc kubenswrapper[4891]: E0929 09:48:19.091237 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-29 09:48:27.091202128 +0000 UTC m=+37.296370449 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.178989 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.179040 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.179050 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.179073 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.179085 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:19Z","lastTransitionTime":"2025-09-29T09:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.192068 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.192125 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.192155 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.192178 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:19 crc kubenswrapper[4891]: E0929 09:48:19.192322 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Sep 29 09:48:19 crc kubenswrapper[4891]: E0929 09:48:19.192339 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:48:19 crc kubenswrapper[4891]: E0929 09:48:19.192352 4891 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:19 crc kubenswrapper[4891]: E0929 09:48:19.192412 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:27.192395652 +0000 UTC m=+37.397563973 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:19 crc kubenswrapper[4891]: E0929 09:48:19.192476 4891 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:48:19 crc kubenswrapper[4891]: E0929 09:48:19.192513 4891 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:48:19 crc kubenswrapper[4891]: E0929 09:48:19.192653 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:27.192591678 +0000 UTC m=+37.397760009 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:48:19 crc kubenswrapper[4891]: E0929 09:48:19.192729 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-09-29 09:48:27.192688121 +0000 UTC m=+37.397856562 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:48:19 crc kubenswrapper[4891]: E0929 09:48:19.192517 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:48:19 crc kubenswrapper[4891]: E0929 09:48:19.192782 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:48:19 crc kubenswrapper[4891]: E0929 09:48:19.192832 4891 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:19 crc kubenswrapper[4891]: E0929 09:48:19.192907 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:27.192892047 +0000 UTC m=+37.398060398 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.281527 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.281580 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.281593 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.281612 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.281625 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:19Z","lastTransitionTime":"2025-09-29T09:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.383991 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.384044 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.384057 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.384076 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.384086 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:19Z","lastTransitionTime":"2025-09-29T09:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.395583 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.395636 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:19 crc kubenswrapper[4891]: E0929 09:48:19.395710 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:19 crc kubenswrapper[4891]: E0929 09:48:19.395889 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.395587 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:19 crc kubenswrapper[4891]: E0929 09:48:19.396009 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.487059 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.487119 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.487131 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.487152 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.487168 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:19Z","lastTransitionTime":"2025-09-29T09:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.590316 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.590378 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.590392 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.590415 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.590431 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:19Z","lastTransitionTime":"2025-09-29T09:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.639608 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerStarted","Data":"6ae641684ddaca6f34d7cc92f72a71421f730e1437bdaa3ababaa7645c1fa007"} Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.641542 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.641564 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.644803 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" event={"ID":"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e","Type":"ContainerStarted","Data":"4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69"} Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.646407 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5nmcv" event={"ID":"143386d4-de10-4bb9-b79e-eaf04f8247ce","Type":"ContainerStarted","Data":"8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21"} Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.646461 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5nmcv" event={"ID":"143386d4-de10-4bb9-b79e-eaf04f8247ce","Type":"ContainerStarted","Data":"79b022f726d0576ce451a7c8a0a8c5c22d5c2cf521bb55a4836e3b4fb063aef4"} Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.658459 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.673843 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.675348 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.676751 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.685664 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.699006 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.699044 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.699061 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 
09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.699081 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.699095 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:19Z","lastTransitionTime":"2025-09-29T09:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.703965 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85
aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.720568 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.734462 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:48:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.748471 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.759270 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.783233 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae641684ddaca6f34d7cc92f72a71421f730e1437bdaa3ababaa7645c1fa007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.802471 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.802516 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.802527 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.802548 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.802433 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.802564 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:19Z","lastTransitionTime":"2025-09-29T09:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.816046 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.831450 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.845961 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.863940 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.881675 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.900944 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.905067 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.905109 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.905120 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.905139 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.905150 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:19Z","lastTransitionTime":"2025-09-29T09:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.925102 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.946026 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.960071 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.971758 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:19 crc kubenswrapper[4891]: I0929 09:48:19.992709 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae641684ddaca6f34d7cc92f72a71421f730e1437bdaa3ababaa7645c1fa007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.007915 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.008611 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.008665 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.008678 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.008699 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.008710 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:20Z","lastTransitionTime":"2025-09-29T09:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.022308 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.057968 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.071020 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.087257 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.102576 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.110877 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:20 crc 
kubenswrapper[4891]: I0929 09:48:20.110925 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.110936 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.110954 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.110967 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:20Z","lastTransitionTime":"2025-09-29T09:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.120138 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc
84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.214013 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.214063 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.214072 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.214090 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.214104 4891 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:20Z","lastTransitionTime":"2025-09-29T09:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.317699 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.317758 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.317769 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.317811 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.317825 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:20Z","lastTransitionTime":"2025-09-29T09:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.416884 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:20Z 
is after 2025-08-24T17:21:41Z" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.425147 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.425200 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.425214 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.425236 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.425248 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:20Z","lastTransitionTime":"2025-09-29T09:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.433575 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ws
cb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.467492 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.485423 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.502232 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.520032 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.527870 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.527913 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.527922 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.527942 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.527952 4891 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:20Z","lastTransitionTime":"2025-09-29T09:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.536305 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrid
es\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.549707 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.560382 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.585417 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.599240 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.611725 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.623095 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.631308 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.631355 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.631366 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.631386 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.631398 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:20Z","lastTransitionTime":"2025-09-29T09:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.642482 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae641684ddaca6f34d7cc92f72a71421f730e1437bdaa3ababaa7645c1fa007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.649599 4891 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.735298 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.735739 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.735753 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.735775 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.735809 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:20Z","lastTransitionTime":"2025-09-29T09:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.838513 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.838564 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.838574 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.838593 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.838602 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:20Z","lastTransitionTime":"2025-09-29T09:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.941522 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.941575 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.941587 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.941609 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:20 crc kubenswrapper[4891]: I0929 09:48:20.941625 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:20Z","lastTransitionTime":"2025-09-29T09:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.043998 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.044043 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.044056 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.044079 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.044092 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:21Z","lastTransitionTime":"2025-09-29T09:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.147358 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.147402 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.147416 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.147434 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.147446 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:21Z","lastTransitionTime":"2025-09-29T09:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.250526 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.250605 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.250624 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.250655 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.250673 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:21Z","lastTransitionTime":"2025-09-29T09:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.353843 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.353886 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.353897 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.353919 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.353932 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:21Z","lastTransitionTime":"2025-09-29T09:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.395862 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.395906 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:21 crc kubenswrapper[4891]: E0929 09:48:21.396050 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.396149 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:21 crc kubenswrapper[4891]: E0929 09:48:21.396308 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:21 crc kubenswrapper[4891]: E0929 09:48:21.396441 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.456000 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.456038 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.456047 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.456062 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.456071 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:21Z","lastTransitionTime":"2025-09-29T09:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.558833 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.558878 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.558888 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.558906 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.558915 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:21Z","lastTransitionTime":"2025-09-29T09:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.656036 4891 generic.go:334] "Generic (PLEG): container finished" podID="d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e" containerID="4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69" exitCode=0 Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.656198 4891 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.656985 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" event={"ID":"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e","Type":"ContainerDied","Data":"4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69"} Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.661277 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.661326 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.661337 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.661357 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.661368 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:21Z","lastTransitionTime":"2025-09-29T09:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.679707 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:21Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.692183 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:21Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.704265 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:21Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.722938 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae641684ddaca6f34d7cc92f72a71421f730e1437bdaa3ababaa7645c1fa007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:21Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.740195 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:21Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.755906 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:21Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.772211 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.772266 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.772482 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.772510 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.772553 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:21Z","lastTransitionTime":"2025-09-29T09:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.772815 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-
binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:21Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.788479 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:21Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.802670 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:21Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.819517 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:21Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.834956 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:21Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.849195 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:21Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.864557 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:21Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.875686 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.875737 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.875751 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.875774 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.875817 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:21Z","lastTransitionTime":"2025-09-29T09:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.877775 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:21Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.978147 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.978183 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.978194 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.978213 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:21 crc kubenswrapper[4891]: I0929 09:48:21.978224 4891 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:21Z","lastTransitionTime":"2025-09-29T09:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.081626 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.081679 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.081706 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.081729 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.081743 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:22Z","lastTransitionTime":"2025-09-29T09:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.185050 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.185121 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.185131 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.185150 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.185179 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:22Z","lastTransitionTime":"2025-09-29T09:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.297119 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.297183 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.297195 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.297214 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.297225 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:22Z","lastTransitionTime":"2025-09-29T09:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.404215 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.404269 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.404278 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.404300 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.404310 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:22Z","lastTransitionTime":"2025-09-29T09:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.507808 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.507855 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.507864 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.507884 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.507895 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:22Z","lastTransitionTime":"2025-09-29T09:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.610605 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.610657 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.610667 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.610688 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.610699 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:22Z","lastTransitionTime":"2025-09-29T09:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.662804 4891 generic.go:334] "Generic (PLEG): container finished" podID="d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e" containerID="89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753" exitCode=0 Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.662871 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" event={"ID":"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e","Type":"ContainerDied","Data":"89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753"} Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.682587 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:22Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.696063 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:22Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.709349 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:22Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.713729 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:22 crc 
kubenswrapper[4891]: I0929 09:48:22.713781 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.713811 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.713833 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.713846 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:22Z","lastTransitionTime":"2025-09-29T09:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.724742 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:22Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.739635 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:22Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.753633 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:22Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.766691 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:22Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.782323 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:22Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.799225 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:22Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.816489 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:22Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.817313 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.817384 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.817404 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.817434 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:22 crc 
kubenswrapper[4891]: I0929 09:48:22.817452 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:22Z","lastTransitionTime":"2025-09-29T09:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.830992 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:22Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.844453 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:22Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.855826 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:22Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.879020 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae641684ddaca6f34d7cc92f72a71421f730e1437bdaa3ababaa7645c1fa007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:22Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.920134 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.920199 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.920208 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.920229 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:22 crc kubenswrapper[4891]: I0929 09:48:22.920239 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:22Z","lastTransitionTime":"2025-09-29T09:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.023242 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.023294 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.023307 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.023330 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.023342 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:23Z","lastTransitionTime":"2025-09-29T09:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.125842 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.125898 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.125910 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.125930 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.125944 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:23Z","lastTransitionTime":"2025-09-29T09:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.228585 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.228623 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.228632 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.228651 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.228661 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:23Z","lastTransitionTime":"2025-09-29T09:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.331137 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.331194 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.331206 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.331227 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.331240 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:23Z","lastTransitionTime":"2025-09-29T09:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.395032 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.395191 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:23 crc kubenswrapper[4891]: E0929 09:48:23.395237 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.395253 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:23 crc kubenswrapper[4891]: E0929 09:48:23.395335 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:23 crc kubenswrapper[4891]: E0929 09:48:23.395547 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.434939 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.435011 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.435031 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.435056 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.435074 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:23Z","lastTransitionTime":"2025-09-29T09:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.538426 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.538471 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.538482 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.538501 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.538512 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:23Z","lastTransitionTime":"2025-09-29T09:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.642966 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.643008 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.643018 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.643035 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.643046 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:23Z","lastTransitionTime":"2025-09-29T09:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.667137 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" event={"ID":"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e","Type":"ContainerStarted","Data":"9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086"} Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.693845 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:23Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.722715 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:23Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.745148 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.745190 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.745203 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 
09:48:23.745226 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.745238 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:23Z","lastTransitionTime":"2025-09-29T09:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.749060 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"
}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:23Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.766386 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:23Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.787833 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:23Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.799838 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:23Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.829935 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae641684ddaca6f34d7cc92f72a71421f730e1437bdaa3ababaa7645c1fa007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:23Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.845585 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:23Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.847893 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.847949 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.847963 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.847985 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.848001 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:23Z","lastTransitionTime":"2025-09-29T09:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.867623 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:23Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.882224 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:23Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.894352 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:23Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.908435 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:23Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.929096 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:23Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.945904 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888
be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:
48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:23Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.950774 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.950812 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.950821 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.950837 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:23 crc kubenswrapper[4891]: I0929 09:48:23.950847 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:23Z","lastTransitionTime":"2025-09-29T09:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.053874 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.053933 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.053945 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.053975 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.053990 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:24Z","lastTransitionTime":"2025-09-29T09:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.157112 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.157163 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.157180 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.157201 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.157219 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:24Z","lastTransitionTime":"2025-09-29T09:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.259851 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.259903 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.259918 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.259939 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.259953 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:24Z","lastTransitionTime":"2025-09-29T09:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.362758 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.362829 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.362844 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.362866 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.362877 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:24Z","lastTransitionTime":"2025-09-29T09:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.461938 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k"] Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.462447 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.464410 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.466305 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.466353 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.466364 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.466386 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.466400 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:24Z","lastTransitionTime":"2025-09-29T09:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.467082 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.479199 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:24Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.493533 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:24Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.510678 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888
be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:
48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:24Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.526074 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:24Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.540346 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:24Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.557538 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:24Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.569243 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.569293 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.569306 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.569329 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.569342 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:24Z","lastTransitionTime":"2025-09-29T09:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.572290 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae5970
5039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:24Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.577500 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/11edc92e-b224-4b6a-a4a8-4ccf9e696341-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-vxv4k\" (UID: \"11edc92e-b224-4b6a-a4a8-4ccf9e696341\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.577555 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-r6k98\" (UniqueName: \"kubernetes.io/projected/11edc92e-b224-4b6a-a4a8-4ccf9e696341-kube-api-access-r6k98\") pod \"ovnkube-control-plane-749d76644c-vxv4k\" (UID: \"11edc92e-b224-4b6a-a4a8-4ccf9e696341\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.577588 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/11edc92e-b224-4b6a-a4a8-4ccf9e696341-env-overrides\") pod \"ovnkube-control-plane-749d76644c-vxv4k\" (UID: \"11edc92e-b224-4b6a-a4a8-4ccf9e696341\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.577908 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/11edc92e-b224-4b6a-a4a8-4ccf9e696341-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-vxv4k\" (UID: \"11edc92e-b224-4b6a-a4a8-4ccf9e696341\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.585037 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:24Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.596567 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:24Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.608284 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:24Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.624745 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:24Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.637200 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:24Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.648150 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:24Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.658275 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:24Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.671452 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.671509 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.671522 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.671544 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.671559 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:24Z","lastTransitionTime":"2025-09-29T09:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.676368 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae641684ddaca6f34d7cc92f72a71421f730e1437bdaa3ababaa7645c1fa007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:24Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.678640 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/11edc92e-b224-4b6a-a4a8-4ccf9e696341-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-vxv4k\" (UID: \"11edc92e-b224-4b6a-a4a8-4ccf9e696341\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.678673 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6k98\" (UniqueName: \"kubernetes.io/projected/11edc92e-b224-4b6a-a4a8-4ccf9e696341-kube-api-access-r6k98\") pod \"ovnkube-control-plane-749d76644c-vxv4k\" (UID: \"11edc92e-b224-4b6a-a4a8-4ccf9e696341\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.678697 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/11edc92e-b224-4b6a-a4a8-4ccf9e696341-env-overrides\") pod \"ovnkube-control-plane-749d76644c-vxv4k\" (UID: \"11edc92e-b224-4b6a-a4a8-4ccf9e696341\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.678732 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/11edc92e-b224-4b6a-a4a8-4ccf9e696341-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-vxv4k\" (UID: \"11edc92e-b224-4b6a-a4a8-4ccf9e696341\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.679324 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/11edc92e-b224-4b6a-a4a8-4ccf9e696341-env-overrides\") pod \"ovnkube-control-plane-749d76644c-vxv4k\" (UID: \"11edc92e-b224-4b6a-a4a8-4ccf9e696341\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.679558 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/11edc92e-b224-4b6a-a4a8-4ccf9e696341-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-vxv4k\" (UID: \"11edc92e-b224-4b6a-a4a8-4ccf9e696341\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.687094 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/11edc92e-b224-4b6a-a4a8-4ccf9e696341-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-vxv4k\" (UID: \"11edc92e-b224-4b6a-a4a8-4ccf9e696341\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.694508 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6k98\" (UniqueName: \"kubernetes.io/projected/11edc92e-b224-4b6a-a4a8-4ccf9e696341-kube-api-access-r6k98\") pod \"ovnkube-control-plane-749d76644c-vxv4k\" (UID: \"11edc92e-b224-4b6a-a4a8-4ccf9e696341\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" Sep 29 09:48:24 crc 
kubenswrapper[4891]: I0929 09:48:24.775069 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.775125 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.775136 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.775156 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.775166 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:24Z","lastTransitionTime":"2025-09-29T09:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.782465 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" Sep 29 09:48:24 crc kubenswrapper[4891]: W0929 09:48:24.800664 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11edc92e_b224_4b6a_a4a8_4ccf9e696341.slice/crio-c98c39cf020694481be5c2a4adcce366d11d62e5eb3a40809ea49e427b8ba027 WatchSource:0}: Error finding container c98c39cf020694481be5c2a4adcce366d11d62e5eb3a40809ea49e427b8ba027: Status 404 returned error can't find the container with id c98c39cf020694481be5c2a4adcce366d11d62e5eb3a40809ea49e427b8ba027 Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.880378 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.880450 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.880658 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.880682 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.880708 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:24Z","lastTransitionTime":"2025-09-29T09:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.987848 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.987886 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.987898 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.987919 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:24 crc kubenswrapper[4891]: I0929 09:48:24.987929 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:24Z","lastTransitionTime":"2025-09-29T09:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.089948 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.089996 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.090009 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.090033 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.090047 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:25Z","lastTransitionTime":"2025-09-29T09:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.192846 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.192905 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.192917 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.192938 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.192949 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:25Z","lastTransitionTime":"2025-09-29T09:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.295851 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.295906 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.295916 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.295932 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.295942 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:25Z","lastTransitionTime":"2025-09-29T09:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.394868 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.394917 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.394879 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:25 crc kubenswrapper[4891]: E0929 09:48:25.395104 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:25 crc kubenswrapper[4891]: E0929 09:48:25.395239 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:25 crc kubenswrapper[4891]: E0929 09:48:25.395361 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.398324 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.398364 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.398376 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.398393 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.398405 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:25Z","lastTransitionTime":"2025-09-29T09:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.501379 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.501428 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.501441 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.501461 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.501470 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:25Z","lastTransitionTime":"2025-09-29T09:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.603771 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.603831 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.603842 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.603861 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.603874 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:25Z","lastTransitionTime":"2025-09-29T09:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.676570 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" event={"ID":"11edc92e-b224-4b6a-a4a8-4ccf9e696341","Type":"ContainerStarted","Data":"c3ca05cc8ef5370d7df062aa2c7d068a5ac74c2431726ff01b95ab9b35400d7f"} Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.676659 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" event={"ID":"11edc92e-b224-4b6a-a4a8-4ccf9e696341","Type":"ContainerStarted","Data":"05d4724eeb028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c"} Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.676673 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" event={"ID":"11edc92e-b224-4b6a-a4a8-4ccf9e696341","Type":"ContainerStarted","Data":"c98c39cf020694481be5c2a4adcce366d11d62e5eb3a40809ea49e427b8ba027"} Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.678589 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fs6qf_01bb1c54-d2f0-498e-ad60-8216c29b843d/ovnkube-controller/0.log" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.681757 4891 generic.go:334] "Generic (PLEG): container finished" podID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerID="6ae641684ddaca6f34d7cc92f72a71421f730e1437bdaa3ababaa7645c1fa007" exitCode=1 Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.681816 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerDied","Data":"6ae641684ddaca6f34d7cc92f72a71421f730e1437bdaa3ababaa7645c1fa007"} Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.682568 4891 scope.go:117] "RemoveContainer" 
containerID="6ae641684ddaca6f34d7cc92f72a71421f730e1437bdaa3ababaa7645c1fa007" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.695494 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disab
led\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.706734 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.706800 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.706811 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.706831 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.706843 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:25Z","lastTransitionTime":"2025-09-29T09:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.713623 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.731179 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724eeb028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74
c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.747504 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.763284 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.778371 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.792624 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.808254 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.809723 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.809747 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.809756 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.809782 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.809806 4891 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:25Z","lastTransitionTime":"2025-09-29T09:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.828547 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae641684ddaca6f34d7cc92f72a71421f730e1437bdaa3ababaa7645c1fa007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.843806 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.860324 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.875781 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.889629 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.903720 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.912456 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:25 crc 
kubenswrapper[4891]: I0929 09:48:25.912498 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.912507 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.912524 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.912533 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:25Z","lastTransitionTime":"2025-09-29T09:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.920501 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88364
724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.921909 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-6thmw"] Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.922420 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:25 crc kubenswrapper[4891]: E0929 09:48:25.922492 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.933322 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.944569 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.955250 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.966703 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.987228 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724eeb028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.998352 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f84hq\" (UniqueName: \"kubernetes.io/projected/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-kube-api-access-f84hq\") pod \"network-metrics-daemon-6thmw\" (UID: \"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\") " pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:25 crc kubenswrapper[4891]: I0929 09:48:25.998399 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs\") pod \"network-metrics-daemon-6thmw\" (UID: \"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\") " pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.001285 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859
5636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.013745 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.016604 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.016641 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.016655 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.016678 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.016690 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:26Z","lastTransitionTime":"2025-09-29T09:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.028198 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.039963 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.060770 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae641684ddaca6f34d7cc92f72a71421f730e1437bdaa3ababaa7645c1fa007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae641684ddaca6f34d7cc92f72a71421f730e1437bdaa3ababaa7645c1fa007\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"y.go:140\\\\nI0929 09:48:25.145708 6084 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0929 09:48:25.145977 6084 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0929 09:48:25.146006 6084 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 09:48:25.146519 6084 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0929 09:48:25.146576 6084 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0929 09:48:25.146600 6084 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0929 09:48:25.146605 6084 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0929 09:48:25.146620 6084 factory.go:656] Stopping watch factory\\\\nI0929 09:48:25.146637 6084 ovnkube.go:599] Stopped ovnkube\\\\nI0929 09:48:25.146662 6084 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0929 09:48:25.146668 6084 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0929 09:48:25.146686 6084 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0929 
09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.072817 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.084039 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.099159 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f84hq\" (UniqueName: 
\"kubernetes.io/projected/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-kube-api-access-f84hq\") pod \"network-metrics-daemon-6thmw\" (UID: \"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\") " pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.099208 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs\") pod \"network-metrics-daemon-6thmw\" (UID: \"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\") " pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:26 crc kubenswrapper[4891]: E0929 09:48:26.099325 4891 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.099247 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88364
724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: E0929 09:48:26.099390 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs podName:45417d1e-e3f1-4cc9-9f51-65affc9d72f6 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:26.599373583 +0000 UTC m=+36.804541904 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs") pod "network-metrics-daemon-6thmw" (UID: "45417d1e-e3f1-4cc9-9f51-65affc9d72f6") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.112563 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.119506 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f84hq\" (UniqueName: \"kubernetes.io/projected/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-kube-api-access-f84hq\") pod \"network-metrics-daemon-6thmw\" (UID: \"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\") " pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.119749 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.119782 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.119807 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.119829 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.119842 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:26Z","lastTransitionTime":"2025-09-29T09:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.126472 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.138375 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.150057 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.164990 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.176518 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.188449 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.201723 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.220585 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724eeb028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.221894 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.221928 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.221938 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.221956 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.221967 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:26Z","lastTransitionTime":"2025-09-29T09:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.234622 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6thmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6thmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc 
kubenswrapper[4891]: I0929 09:48:26.247450 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.259116 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.270584 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.280679 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.299187 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae641684ddaca6f34d7cc92f72a71421f730e1437bdaa3ababaa7645c1fa007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae641684ddaca6f34d7cc92f72a71421f730e1437bdaa3ababaa7645c1fa007\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"y.go:140\\\\nI0929 09:48:25.145708 6084 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0929 09:48:25.145977 6084 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0929 09:48:25.146006 6084 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 09:48:25.146519 6084 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0929 09:48:25.146576 6084 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0929 09:48:25.146600 6084 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0929 09:48:25.146605 6084 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0929 09:48:25.146620 6084 factory.go:656] Stopping watch factory\\\\nI0929 09:48:25.146637 6084 ovnkube.go:599] Stopped ovnkube\\\\nI0929 09:48:25.146662 6084 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0929 09:48:25.146668 6084 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0929 09:48:25.146686 6084 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0929 
09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.311937 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.324323 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:26 crc 
kubenswrapper[4891]: I0929 09:48:26.324357 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.324365 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.324380 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.324389 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:26Z","lastTransitionTime":"2025-09-29T09:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.326730 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88364
724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.338921 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.396585 4891 scope.go:117] "RemoveContainer" containerID="afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.429511 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.429562 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.429574 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.429593 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.429606 4891 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:26Z","lastTransitionTime":"2025-09-29T09:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.533116 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.533165 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.533199 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.533222 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.533238 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:26Z","lastTransitionTime":"2025-09-29T09:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.605370 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs\") pod \"network-metrics-daemon-6thmw\" (UID: \"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\") " pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:26 crc kubenswrapper[4891]: E0929 09:48:26.605521 4891 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:48:26 crc kubenswrapper[4891]: E0929 09:48:26.605606 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs podName:45417d1e-e3f1-4cc9-9f51-65affc9d72f6 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:27.60558498 +0000 UTC m=+37.810753301 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs") pod "network-metrics-daemon-6thmw" (UID: "45417d1e-e3f1-4cc9-9f51-65affc9d72f6") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.636363 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.636435 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.636452 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.636471 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.636484 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:26Z","lastTransitionTime":"2025-09-29T09:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.688920 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fs6qf_01bb1c54-d2f0-498e-ad60-8216c29b843d/ovnkube-controller/0.log" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.691563 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerStarted","Data":"b0f2a330fe7af1b09a0b1205eb516076caebd3dcbe61ef11cfd70dd6006868c2"} Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.691726 4891 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.712616 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.729861 4891 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.739547 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.739592 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.739603 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.739624 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.739636 4891 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:26Z","lastTransitionTime":"2025-09-29T09:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.744606 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615ded802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.758603 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.777339 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724eeb028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74
c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.790226 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6thmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6thmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc 
kubenswrapper[4891]: I0929 09:48:26.803980 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.815758 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.828919 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.838557 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.842282 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.842334 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.842348 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.842370 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.842383 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:26Z","lastTransitionTime":"2025-09-29T09:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.856962 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f2a330fe7af1b09a0b1205eb516076caebd3dcbe61ef11cfd70dd6006868c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae641684ddaca6f34d7cc92f72a71421f730e1437bdaa3ababaa7645c1fa007\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"y.go:140\\\\nI0929 09:48:25.145708 6084 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0929 09:48:25.145977 6084 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0929 09:48:25.146006 6084 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 09:48:25.146519 6084 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0929 09:48:25.146576 6084 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0929 09:48:25.146600 6084 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0929 09:48:25.146605 6084 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0929 09:48:25.146620 6084 factory.go:656] Stopping watch factory\\\\nI0929 09:48:25.146637 6084 ovnkube.go:599] Stopped ovnkube\\\\nI0929 09:48:25.146662 6084 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0929 09:48:25.146668 6084 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0929 09:48:25.146686 6084 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0929 
09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.871278 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.889276 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888
be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:
48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.913587 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.927563 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.945302 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.945404 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.945420 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 
09:48:26.945708 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.945779 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:26Z","lastTransitionTime":"2025-09-29T09:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:26 crc kubenswrapper[4891]: I0929 09:48:26.950144 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.048580 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.048842 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.048863 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.048888 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.048918 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:27Z","lastTransitionTime":"2025-09-29T09:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.111580 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:48:27 crc kubenswrapper[4891]: E0929 09:48:27.111867 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:48:43.111841678 +0000 UTC m=+53.317010009 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.151768 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.151845 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.151858 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.151876 4891 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.151888 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:27Z","lastTransitionTime":"2025-09-29T09:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.212670 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.212722 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.212757 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.212774 4891 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:27 crc kubenswrapper[4891]: E0929 09:48:27.212897 4891 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:48:27 crc kubenswrapper[4891]: E0929 09:48:27.212967 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:43.212947039 +0000 UTC m=+53.418115370 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:48:27 crc kubenswrapper[4891]: E0929 09:48:27.213050 4891 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:48:27 crc kubenswrapper[4891]: E0929 09:48:27.213078 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:43.213069703 +0000 UTC m=+53.418238024 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:48:27 crc kubenswrapper[4891]: E0929 09:48:27.213121 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:48:27 crc kubenswrapper[4891]: E0929 09:48:27.213182 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:48:27 crc kubenswrapper[4891]: E0929 09:48:27.213200 4891 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:27 crc kubenswrapper[4891]: E0929 09:48:27.213297 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:43.213256209 +0000 UTC m=+53.418424600 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:27 crc kubenswrapper[4891]: E0929 09:48:27.213380 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:48:27 crc kubenswrapper[4891]: E0929 09:48:27.213392 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:48:27 crc kubenswrapper[4891]: E0929 09:48:27.213402 4891 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:27 crc kubenswrapper[4891]: E0929 09:48:27.213465 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:43.213455495 +0000 UTC m=+53.418623816 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.254929 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.254972 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.254983 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.255001 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.255013 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:27Z","lastTransitionTime":"2025-09-29T09:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.358024 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.358063 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.358075 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.358093 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.358104 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:27Z","lastTransitionTime":"2025-09-29T09:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.394973 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.395046 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:27 crc kubenswrapper[4891]: E0929 09:48:27.395142 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.395230 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:27 crc kubenswrapper[4891]: E0929 09:48:27.395333 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:27 crc kubenswrapper[4891]: E0929 09:48:27.395398 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.395000 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:27 crc kubenswrapper[4891]: E0929 09:48:27.395736 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.461167 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.461474 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.461545 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.461665 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.461759 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:27Z","lastTransitionTime":"2025-09-29T09:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.565502 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.565590 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.565609 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.565641 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.565665 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:27Z","lastTransitionTime":"2025-09-29T09:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.617641 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs\") pod \"network-metrics-daemon-6thmw\" (UID: \"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\") " pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:27 crc kubenswrapper[4891]: E0929 09:48:27.617921 4891 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:48:27 crc kubenswrapper[4891]: E0929 09:48:27.618064 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs podName:45417d1e-e3f1-4cc9-9f51-65affc9d72f6 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:29.618030973 +0000 UTC m=+39.823199324 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs") pod "network-metrics-daemon-6thmw" (UID: "45417d1e-e3f1-4cc9-9f51-65affc9d72f6") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.667987 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.668069 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.668095 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.668125 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.668145 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:27Z","lastTransitionTime":"2025-09-29T09:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.697263 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.699262 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"668c9c1c59bbc122cd5be3626616f5eedc5451c880f2b1fd5107e66b0d0f1ef8"} Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.700259 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.701490 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fs6qf_01bb1c54-d2f0-498e-ad60-8216c29b843d/ovnkube-controller/1.log" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.702056 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fs6qf_01bb1c54-d2f0-498e-ad60-8216c29b843d/ovnkube-controller/0.log" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.704980 4891 generic.go:334] "Generic (PLEG): container finished" podID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerID="b0f2a330fe7af1b09a0b1205eb516076caebd3dcbe61ef11cfd70dd6006868c2" exitCode=1 Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.705026 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerDied","Data":"b0f2a330fe7af1b09a0b1205eb516076caebd3dcbe61ef11cfd70dd6006868c2"} Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.705065 4891 scope.go:117] "RemoveContainer" containerID="6ae641684ddaca6f34d7cc92f72a71421f730e1437bdaa3ababaa7645c1fa007" 
Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.705833 4891 scope.go:117] "RemoveContainer" containerID="b0f2a330fe7af1b09a0b1205eb516076caebd3dcbe61ef11cfd70dd6006868c2" Sep 29 09:48:27 crc kubenswrapper[4891]: E0929 09:48:27.706076 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-fs6qf_openshift-ovn-kubernetes(01bb1c54-d2f0-498e-ad60-8216c29b843d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.718962 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.733543 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.757325 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f2a330fe7af1b09a0b1205eb516076caebd3dcbe61ef11cfd70dd6006868c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae641684ddaca6f34d7cc92f72a71421f730e1437bdaa3ababaa7645c1fa007\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"y.go:140\\\\nI0929 09:48:25.145708 6084 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0929 09:48:25.145977 6084 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0929 09:48:25.146006 6084 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 09:48:25.146519 6084 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0929 09:48:25.146576 6084 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0929 09:48:25.146600 6084 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0929 09:48:25.146605 6084 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0929 09:48:25.146620 6084 factory.go:656] Stopping watch factory\\\\nI0929 09:48:25.146637 6084 ovnkube.go:599] Stopped ovnkube\\\\nI0929 09:48:25.146662 6084 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0929 09:48:25.146668 6084 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0929 09:48:25.146686 6084 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0929 
09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.771621 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.771668 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.771680 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.771700 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.771713 4891 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:27Z","lastTransitionTime":"2025-09-29T09:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.775604 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://668c9c1c59bbc122cd5be3626616f5eedc5451c880f2b1fd5107e66b0d0f1ef8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.791216 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.807066 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.823551 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.843620 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888
be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:
48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.858447 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.872989 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.874670 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.874735 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.874748 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 
09:48:27.874770 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.874784 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:27Z","lastTransitionTime":"2025-09-29T09:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.886523 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a
45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.899016 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724eeb028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74
c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.922207 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6thmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6thmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:27 crc 
kubenswrapper[4891]: I0929 09:48:27.940811 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.956278 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.970440 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.977286 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.977316 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.977327 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:27 crc 
kubenswrapper[4891]: I0929 09:48:27.977347 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.977360 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:27Z","lastTransitionTime":"2025-09-29T09:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:27 crc kubenswrapper[4891]: I0929 09:48:27.986385 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.006274 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:28Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.016944 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6thmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6thmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:28Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:28 crc 
kubenswrapper[4891]: I0929 09:48:28.024969 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.025020 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.025046 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.025066 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.025082 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:28Z","lastTransitionTime":"2025-09-29T09:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.034552 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b
4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:28Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:28 crc kubenswrapper[4891]: E0929 09:48:28.037357 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:28Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.040653 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.040687 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.040700 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.040721 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.040734 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:28Z","lastTransitionTime":"2025-09-29T09:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.047314 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:28Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:28 crc kubenswrapper[4891]: E0929 09:48:28.054421 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:28Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.057908 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.057949 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.057959 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.057978 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.057987 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:28Z","lastTransitionTime":"2025-09-29T09:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.063356 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615ded802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:28Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:28 crc kubenswrapper[4891]: E0929 09:48:28.069711 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:28Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.073752 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.073878 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.073989 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.074064 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.074127 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:28Z","lastTransitionTime":"2025-09-29T09:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.079542 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:28Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:28 crc kubenswrapper[4891]: E0929 09:48:28.089077 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:28Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.092027 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724eeb028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74
c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:28Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.092534 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.092705 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.092830 4891 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.092932 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.093016 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:28Z","lastTransitionTime":"2025-09-29T09:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:28 crc kubenswrapper[4891]: E0929 09:48:28.105666 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:28Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:28 crc kubenswrapper[4891]: E0929 09:48:28.105826 4891 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.107734 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.107807 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.107819 4891 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.107837 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.107848 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:28Z","lastTransitionTime":"2025-09-29T09:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.113092 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f2a330fe7af1b09a0b1205eb516076caebd3dcbe61ef11cfd70dd6006868c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae641684ddaca6f34d7cc92f72a71421f730e1437bdaa3ababaa7645c1fa007\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"y.go:140\\\\nI0929 09:48:25.145708 6084 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0929 09:48:25.145977 6084 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0929 09:48:25.146006 6084 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 09:48:25.146519 6084 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0929 09:48:25.146576 6084 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0929 09:48:25.146600 6084 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0929 09:48:25.146605 6084 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0929 09:48:25.146620 6084 factory.go:656] Stopping watch factory\\\\nI0929 09:48:25.146637 6084 ovnkube.go:599] Stopped ovnkube\\\\nI0929 09:48:25.146662 6084 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0929 09:48:25.146668 6084 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0929 09:48:25.146686 6084 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0929 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f2a330fe7af1b09a0b1205eb516076caebd3dcbe61ef11cfd70dd6006868c2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:48:27Z\\\",\\\"message\\\":\\\".217.4.176\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:80, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.176\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, 
externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.176\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:1936, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0929 09:48:27.159147 6361 services_controller.go:444] Built service openshift-ingress/router-internal-default LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0929 09:48:27.159154 6361 services_controller.go:445] Built service openshift-ingress/router-internal-default LB template configs for network=default: []services.lbConfig(nil)\\\\nF0929 09:48:27.159155 6361 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mount
Path\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:28Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.128007 4891 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://668c9c1c59bbc122cd5be3626616f5eedc5451c880f2b1fd5107e66b0d0f1ef8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e
6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2
597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:28Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.140997 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:48:28Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.152036 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:28Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.162823 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:28Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.178894 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:28Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.192596 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:28Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.209533 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888
be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:
48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:28Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.209843 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.209881 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.209893 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.209914 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.209924 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:28Z","lastTransitionTime":"2025-09-29T09:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.315069 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.316682 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.316778 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.316886 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.316970 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:28Z","lastTransitionTime":"2025-09-29T09:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.420082 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.420130 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.420143 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.420163 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.420174 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:28Z","lastTransitionTime":"2025-09-29T09:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.524499 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.524853 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.525108 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.525178 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.525254 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:28Z","lastTransitionTime":"2025-09-29T09:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.628497 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.628560 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.628571 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.628591 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.628607 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:28Z","lastTransitionTime":"2025-09-29T09:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.709571 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fs6qf_01bb1c54-d2f0-498e-ad60-8216c29b843d/ovnkube-controller/1.log" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.732006 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.732074 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.732092 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.732116 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.732133 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:28Z","lastTransitionTime":"2025-09-29T09:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.835028 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.835077 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.835088 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.835107 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.835127 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:28Z","lastTransitionTime":"2025-09-29T09:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.938398 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.938442 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.938451 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.938470 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:28 crc kubenswrapper[4891]: I0929 09:48:28.938480 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:28Z","lastTransitionTime":"2025-09-29T09:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.041406 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.041474 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.041488 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.041513 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.041535 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:29Z","lastTransitionTime":"2025-09-29T09:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.143606 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.143657 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.143669 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.143685 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.143695 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:29Z","lastTransitionTime":"2025-09-29T09:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.245843 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.245895 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.245904 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.245921 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.245931 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:29Z","lastTransitionTime":"2025-09-29T09:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.349489 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.349527 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.349544 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.349565 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.349579 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:29Z","lastTransitionTime":"2025-09-29T09:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.395115 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:29 crc kubenswrapper[4891]: E0929 09:48:29.395294 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.395683 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:29 crc kubenswrapper[4891]: E0929 09:48:29.395736 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.395772 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:29 crc kubenswrapper[4891]: E0929 09:48:29.395882 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.395941 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:29 crc kubenswrapper[4891]: E0929 09:48:29.395994 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.452149 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.452200 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.452216 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.452251 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.452263 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:29Z","lastTransitionTime":"2025-09-29T09:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.555131 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.555175 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.555185 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.555205 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.555216 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:29Z","lastTransitionTime":"2025-09-29T09:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.642187 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs\") pod \"network-metrics-daemon-6thmw\" (UID: \"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\") " pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:29 crc kubenswrapper[4891]: E0929 09:48:29.642406 4891 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:48:29 crc kubenswrapper[4891]: E0929 09:48:29.642536 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs podName:45417d1e-e3f1-4cc9-9f51-65affc9d72f6 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:33.642504678 +0000 UTC m=+43.847673069 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs") pod "network-metrics-daemon-6thmw" (UID: "45417d1e-e3f1-4cc9-9f51-65affc9d72f6") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.657411 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.657454 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.657466 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.657489 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.657501 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:29Z","lastTransitionTime":"2025-09-29T09:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.759942 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.759975 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.759983 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.760000 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.760010 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:29Z","lastTransitionTime":"2025-09-29T09:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.862163 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.862209 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.862218 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.862233 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.862245 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:29Z","lastTransitionTime":"2025-09-29T09:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.964903 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.964948 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.964960 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.964979 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:29 crc kubenswrapper[4891]: I0929 09:48:29.964992 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:29Z","lastTransitionTime":"2025-09-29T09:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.068092 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.068154 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.068170 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.068193 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.068207 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:30Z","lastTransitionTime":"2025-09-29T09:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.170711 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.170760 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.170772 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.170813 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.170829 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:30Z","lastTransitionTime":"2025-09-29T09:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.273776 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.273881 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.273892 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.273914 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.273928 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:30Z","lastTransitionTime":"2025-09-29T09:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.376158 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.376219 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.376228 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.376249 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.376260 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:30Z","lastTransitionTime":"2025-09-29T09:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.412205 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.426004 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.441945 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888
be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:
48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.459012 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.475984 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.478988 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.479030 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.479042 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 
09:48:30.479059 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.479068 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:30Z","lastTransitionTime":"2025-09-29T09:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.492326 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"
}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.512551 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.524148 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.535893 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.548843 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724eeb028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.560425 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6thmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6thmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:30 crc 
kubenswrapper[4891]: I0929 09:48:30.578917 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://668c9c1c59bbc122cd5be3626616f5eedc5451c880f2b1fd5107e66b0d0f1ef8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.582497 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.582544 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.582556 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.582602 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.582616 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:30Z","lastTransitionTime":"2025-09-29T09:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.596422 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.611477 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.625655 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.649396 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f2a330fe7af1b09a0b1205eb516076caebd3dcbe61ef11cfd70dd6006868c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae641684ddaca6f34d7cc92f72a71421f730e1437bdaa3ababaa7645c1fa007\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"y.go:140\\\\nI0929 09:48:25.145708 6084 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0929 09:48:25.145977 6084 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0929 09:48:25.146006 6084 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 09:48:25.146519 6084 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0929 09:48:25.146576 6084 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0929 09:48:25.146600 6084 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0929 09:48:25.146605 6084 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0929 09:48:25.146620 6084 factory.go:656] Stopping watch factory\\\\nI0929 09:48:25.146637 6084 ovnkube.go:599] Stopped ovnkube\\\\nI0929 09:48:25.146662 6084 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0929 09:48:25.146668 6084 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0929 09:48:25.146686 6084 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0929 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f2a330fe7af1b09a0b1205eb516076caebd3dcbe61ef11cfd70dd6006868c2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:48:27Z\\\",\\\"message\\\":\\\".217.4.176\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:80, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.176\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, 
externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.176\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:1936, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0929 09:48:27.159147 6361 services_controller.go:444] Built service openshift-ingress/router-internal-default LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0929 09:48:27.159154 6361 services_controller.go:445] Built service openshift-ingress/router-internal-default LB template configs for network=default: []services.lbConfig(nil)\\\\nF0929 09:48:27.159155 6361 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mount
Path\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.686683 4891 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.686758 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.686772 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.686974 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.687012 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:30Z","lastTransitionTime":"2025-09-29T09:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.789974 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.790028 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.790038 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.790055 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.790065 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:30Z","lastTransitionTime":"2025-09-29T09:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.892681 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.892719 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.892730 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.892750 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.892762 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:30Z","lastTransitionTime":"2025-09-29T09:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.996259 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.996303 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.996313 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.996332 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:30 crc kubenswrapper[4891]: I0929 09:48:30.996344 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:30Z","lastTransitionTime":"2025-09-29T09:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.099079 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.099143 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.099152 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.099174 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.099187 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:31Z","lastTransitionTime":"2025-09-29T09:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.202705 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.202766 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.202779 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.202820 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.202835 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:31Z","lastTransitionTime":"2025-09-29T09:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.306174 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.306220 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.306230 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.306249 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.306266 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:31Z","lastTransitionTime":"2025-09-29T09:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.395735 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.395823 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.395923 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.395735 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:31 crc kubenswrapper[4891]: E0929 09:48:31.396018 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:48:31 crc kubenswrapper[4891]: E0929 09:48:31.396172 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:31 crc kubenswrapper[4891]: E0929 09:48:31.396223 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:31 crc kubenswrapper[4891]: E0929 09:48:31.396286 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.409309 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.409374 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.409387 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.409410 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.409425 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:31Z","lastTransitionTime":"2025-09-29T09:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.512212 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.512275 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.512285 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.512304 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.512316 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:31Z","lastTransitionTime":"2025-09-29T09:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.614480 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.614538 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.614548 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.614570 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.614581 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:31Z","lastTransitionTime":"2025-09-29T09:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.717219 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.717272 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.717282 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.717299 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.717309 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:31Z","lastTransitionTime":"2025-09-29T09:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.820074 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.820119 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.820130 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.820148 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.820160 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:31Z","lastTransitionTime":"2025-09-29T09:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.923485 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.923543 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.923553 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.923573 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:31 crc kubenswrapper[4891]: I0929 09:48:31.923588 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:31Z","lastTransitionTime":"2025-09-29T09:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.026295 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.026352 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.026364 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.026385 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.026399 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:32Z","lastTransitionTime":"2025-09-29T09:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.129596 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.129661 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.129675 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.129698 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.129711 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:32Z","lastTransitionTime":"2025-09-29T09:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.233216 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.233281 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.233295 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.233322 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.233337 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:32Z","lastTransitionTime":"2025-09-29T09:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.336046 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.336103 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.336121 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.336143 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.336157 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:32Z","lastTransitionTime":"2025-09-29T09:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.438705 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.438747 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.438757 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.438774 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.438816 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:32Z","lastTransitionTime":"2025-09-29T09:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.542051 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.542285 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.542372 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.542399 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.542416 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:32Z","lastTransitionTime":"2025-09-29T09:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.646455 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.646527 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.646541 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.646568 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.646584 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:32Z","lastTransitionTime":"2025-09-29T09:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.749830 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.749879 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.749891 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.749913 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.749924 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:32Z","lastTransitionTime":"2025-09-29T09:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.853055 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.853138 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.853152 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.853174 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.853186 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:32Z","lastTransitionTime":"2025-09-29T09:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.956352 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.956426 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.956440 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.956472 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:32 crc kubenswrapper[4891]: I0929 09:48:32.956487 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:32Z","lastTransitionTime":"2025-09-29T09:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.060111 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.060181 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.060195 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.060221 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.060237 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:33Z","lastTransitionTime":"2025-09-29T09:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.163422 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.163575 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.163607 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.163648 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.163672 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:33Z","lastTransitionTime":"2025-09-29T09:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.266495 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.266565 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.266581 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.266608 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.266622 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:33Z","lastTransitionTime":"2025-09-29T09:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.369582 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.369637 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.369648 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.369669 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.369682 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:33Z","lastTransitionTime":"2025-09-29T09:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.395511 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.395697 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.395720 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.395733 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:33 crc kubenswrapper[4891]: E0929 09:48:33.395912 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:33 crc kubenswrapper[4891]: E0929 09:48:33.396076 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:48:33 crc kubenswrapper[4891]: E0929 09:48:33.396217 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:33 crc kubenswrapper[4891]: E0929 09:48:33.396308 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.473521 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.473559 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.473567 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.473583 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.473593 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:33Z","lastTransitionTime":"2025-09-29T09:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.576832 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.576878 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.576890 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.576912 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.576924 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:33Z","lastTransitionTime":"2025-09-29T09:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.679460 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.679525 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.679539 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.679561 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.679575 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:33Z","lastTransitionTime":"2025-09-29T09:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.688953 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs\") pod \"network-metrics-daemon-6thmw\" (UID: \"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\") " pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:33 crc kubenswrapper[4891]: E0929 09:48:33.689118 4891 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:48:33 crc kubenswrapper[4891]: E0929 09:48:33.689210 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs podName:45417d1e-e3f1-4cc9-9f51-65affc9d72f6 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:41.689188007 +0000 UTC m=+51.894356328 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs") pod "network-metrics-daemon-6thmw" (UID: "45417d1e-e3f1-4cc9-9f51-65affc9d72f6") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.782206 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.782285 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.782298 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.782320 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.782339 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:33Z","lastTransitionTime":"2025-09-29T09:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.885238 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.885284 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.885294 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.885313 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.885329 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:33Z","lastTransitionTime":"2025-09-29T09:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.987830 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.987882 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.987899 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.987922 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:33 crc kubenswrapper[4891]: I0929 09:48:33.987933 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:33Z","lastTransitionTime":"2025-09-29T09:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.091380 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.091424 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.091436 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.091456 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.091473 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:34Z","lastTransitionTime":"2025-09-29T09:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.194945 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.194986 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.194996 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.195012 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.195022 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:34Z","lastTransitionTime":"2025-09-29T09:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.297606 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.297671 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.297683 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.297707 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.297721 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:34Z","lastTransitionTime":"2025-09-29T09:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.400712 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.400835 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.400850 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.400870 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.400883 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:34Z","lastTransitionTime":"2025-09-29T09:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.504464 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.504518 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.504527 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.504545 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.504556 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:34Z","lastTransitionTime":"2025-09-29T09:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.606997 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.607057 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.607067 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.607087 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.607098 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:34Z","lastTransitionTime":"2025-09-29T09:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.710038 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.710104 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.710125 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.710154 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.710177 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:34Z","lastTransitionTime":"2025-09-29T09:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.813241 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.813325 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.813336 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.813359 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.813371 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:34Z","lastTransitionTime":"2025-09-29T09:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.916680 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.916727 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.916740 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.916762 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:34 crc kubenswrapper[4891]: I0929 09:48:34.916774 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:34Z","lastTransitionTime":"2025-09-29T09:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.020190 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.020261 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.020284 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.020315 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.020341 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:35Z","lastTransitionTime":"2025-09-29T09:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.122913 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.122964 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.122976 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.122996 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.123007 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:35Z","lastTransitionTime":"2025-09-29T09:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.225944 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.226007 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.226017 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.226035 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.226047 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:35Z","lastTransitionTime":"2025-09-29T09:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.328677 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.328728 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.328740 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.328762 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.328777 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:35Z","lastTransitionTime":"2025-09-29T09:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.395722 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.395821 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.395848 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.395849 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:35 crc kubenswrapper[4891]: E0929 09:48:35.395958 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:48:35 crc kubenswrapper[4891]: E0929 09:48:35.396196 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:35 crc kubenswrapper[4891]: E0929 09:48:35.396276 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:35 crc kubenswrapper[4891]: E0929 09:48:35.396326 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.431673 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.431723 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.431733 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.431751 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.431760 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:35Z","lastTransitionTime":"2025-09-29T09:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.535230 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.535298 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.535311 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.535337 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.535350 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:35Z","lastTransitionTime":"2025-09-29T09:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.639651 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.639732 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.639752 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.639855 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.639889 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:35Z","lastTransitionTime":"2025-09-29T09:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.743202 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.743280 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.743303 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.743333 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.743355 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:35Z","lastTransitionTime":"2025-09-29T09:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.846744 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.846811 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.846824 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.846844 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.846856 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:35Z","lastTransitionTime":"2025-09-29T09:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.950539 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.950600 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.950610 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.950631 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:35 crc kubenswrapper[4891]: I0929 09:48:35.950641 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:35Z","lastTransitionTime":"2025-09-29T09:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.053332 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.053389 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.053400 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.053421 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.053433 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:36Z","lastTransitionTime":"2025-09-29T09:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.155565 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.155673 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.155688 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.155708 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.155718 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:36Z","lastTransitionTime":"2025-09-29T09:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.258294 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.258342 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.258353 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.258372 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.258385 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:36Z","lastTransitionTime":"2025-09-29T09:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.361127 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.361182 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.361194 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.361218 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.361232 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:36Z","lastTransitionTime":"2025-09-29T09:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.464684 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.464764 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.464810 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.464840 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.464855 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:36Z","lastTransitionTime":"2025-09-29T09:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.567581 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.567638 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.567652 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.567675 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.567687 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:36Z","lastTransitionTime":"2025-09-29T09:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.670560 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.670611 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.670624 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.670647 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.670660 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:36Z","lastTransitionTime":"2025-09-29T09:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.773149 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.773196 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.773208 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.773227 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.773237 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:36Z","lastTransitionTime":"2025-09-29T09:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.875967 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.876035 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.876056 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.876089 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.876107 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:36Z","lastTransitionTime":"2025-09-29T09:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.979415 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.979470 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.979478 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.979499 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:36 crc kubenswrapper[4891]: I0929 09:48:36.979508 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:36Z","lastTransitionTime":"2025-09-29T09:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.082470 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.082516 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.082525 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.082542 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.082553 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:37Z","lastTransitionTime":"2025-09-29T09:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.185531 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.185595 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.185605 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.185629 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.185643 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:37Z","lastTransitionTime":"2025-09-29T09:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.289019 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.289086 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.289099 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.289120 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.289136 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:37Z","lastTransitionTime":"2025-09-29T09:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.392750 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.392862 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.392878 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.392900 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.392914 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:37Z","lastTransitionTime":"2025-09-29T09:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.395405 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.395417 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.395443 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.395406 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:37 crc kubenswrapper[4891]: E0929 09:48:37.395563 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:37 crc kubenswrapper[4891]: E0929 09:48:37.395839 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:48:37 crc kubenswrapper[4891]: E0929 09:48:37.395899 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:37 crc kubenswrapper[4891]: E0929 09:48:37.396443 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.495423 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.495479 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.495492 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.495512 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.495527 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:37Z","lastTransitionTime":"2025-09-29T09:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.600770 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.600852 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.600866 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.600951 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.600973 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:37Z","lastTransitionTime":"2025-09-29T09:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.704193 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.704246 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.704254 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.704273 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.704282 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:37Z","lastTransitionTime":"2025-09-29T09:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.806859 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.806921 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.806930 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.806950 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.806961 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:37Z","lastTransitionTime":"2025-09-29T09:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.910057 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.910136 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.910154 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.910179 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:37 crc kubenswrapper[4891]: I0929 09:48:37.910194 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:37Z","lastTransitionTime":"2025-09-29T09:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.014460 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.014539 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.014564 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.014601 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.014627 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:38Z","lastTransitionTime":"2025-09-29T09:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.105746 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.117160 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.117232 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.117248 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.117274 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.117290 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:38Z","lastTransitionTime":"2025-09-29T09:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.130468 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://668c9c1c59bbc122cd5be3626616f5eedc5451c880f2b1fd5107e66b0d0f1ef8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:38Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.144709 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:48:38Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.158286 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:38Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.170511 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:38Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.188735 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f2a330fe7af1b09a0b1205eb516076caebd3dcbe61ef11cfd70dd6006868c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae641684ddaca6f34d7cc92f72a71421f730e1437bdaa3ababaa7645c1fa007\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"y.go:140\\\\nI0929 09:48:25.145708 6084 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0929 09:48:25.145977 6084 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0929 09:48:25.146006 6084 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 09:48:25.146519 6084 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0929 09:48:25.146576 6084 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0929 09:48:25.146600 6084 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0929 09:48:25.146605 6084 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0929 09:48:25.146620 6084 factory.go:656] Stopping watch factory\\\\nI0929 09:48:25.146637 6084 ovnkube.go:599] Stopped ovnkube\\\\nI0929 09:48:25.146662 6084 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0929 09:48:25.146668 6084 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0929 09:48:25.146686 6084 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0929 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f2a330fe7af1b09a0b1205eb516076caebd3dcbe61ef11cfd70dd6006868c2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:48:27Z\\\",\\\"message\\\":\\\".217.4.176\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:80, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.176\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, 
externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.176\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:1936, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0929 09:48:27.159147 6361 services_controller.go:444] Built service openshift-ingress/router-internal-default LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0929 09:48:27.159154 6361 services_controller.go:445] Built service openshift-ingress/router-internal-default LB template configs for network=default: []services.lbConfig(nil)\\\\nF0929 09:48:27.159155 6361 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mount
Path\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:38Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.203611 4891 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"init
ContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08
e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:4
8:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:38Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.216508 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:38Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.220121 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.220162 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.220179 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.220199 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.220248 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:38Z","lastTransitionTime":"2025-09-29T09:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.230604 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:38Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.246373 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:38Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.248743 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.248812 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.248822 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.248841 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.248853 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:38Z","lastTransitionTime":"2025-09-29T09:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.262096 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:38Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:38 crc kubenswrapper[4891]: E0929 09:48:38.264374 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:38Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.268736 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.268782 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.268815 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.268836 
4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.268847 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:38Z","lastTransitionTime":"2025-09-29T09:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.276182 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:38Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:38 crc kubenswrapper[4891]: E0929 09:48:38.279808 4891 kubelet_node_status.go:585] "Error 
updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:38Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.283026 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.283054 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.283064 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.283081 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.283093 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:38Z","lastTransitionTime":"2025-09-29T09:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.287577 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615ded802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:38Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:38 crc kubenswrapper[4891]: E0929 09:48:38.294357 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:38Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.298076 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.298141 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.298155 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.298178 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.298193 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:38Z","lastTransitionTime":"2025-09-29T09:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.298925 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:38Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.310244 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724ee
b028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:38Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:38 crc kubenswrapper[4891]: E0929 09:48:38.310240 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:38Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.314173 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.314231 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.314246 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.314268 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.314282 4891 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:38Z","lastTransitionTime":"2025-09-29T09:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.322152 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6thmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6thmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:38Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:38 crc 
kubenswrapper[4891]: E0929 09:48:38.325850 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:38Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:38 crc kubenswrapper[4891]: E0929 09:48:38.325973 4891 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.327607 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.327622 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.327655 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.327676 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.327688 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:38Z","lastTransitionTime":"2025-09-29T09:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.335235 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller
-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:38Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.430543 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.430625 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.430638 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.430661 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.430676 4891 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:38Z","lastTransitionTime":"2025-09-29T09:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.534275 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.534327 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.534343 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.534366 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.534381 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:38Z","lastTransitionTime":"2025-09-29T09:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.637452 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.637534 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.637563 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.637597 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.637621 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:38Z","lastTransitionTime":"2025-09-29T09:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.742224 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.742268 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.742277 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.742299 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.742310 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:38Z","lastTransitionTime":"2025-09-29T09:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.845719 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.845771 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.845780 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.845818 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.845829 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:38Z","lastTransitionTime":"2025-09-29T09:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.948124 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.948203 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.948218 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.948237 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:38 crc kubenswrapper[4891]: I0929 09:48:38.948269 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:38Z","lastTransitionTime":"2025-09-29T09:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.051380 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.051433 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.051442 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.051460 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.051471 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:39Z","lastTransitionTime":"2025-09-29T09:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.153751 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.153823 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.153838 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.153859 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.153870 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:39Z","lastTransitionTime":"2025-09-29T09:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.257185 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.257240 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.257250 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.257268 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.257280 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:39Z","lastTransitionTime":"2025-09-29T09:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.360000 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.360067 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.360080 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.360103 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.360114 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:39Z","lastTransitionTime":"2025-09-29T09:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.395619 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.395716 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.395747 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.395848 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:39 crc kubenswrapper[4891]: E0929 09:48:39.395859 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:48:39 crc kubenswrapper[4891]: E0929 09:48:39.395939 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:39 crc kubenswrapper[4891]: E0929 09:48:39.396020 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:39 crc kubenswrapper[4891]: E0929 09:48:39.396302 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.396633 4891 scope.go:117] "RemoveContainer" containerID="b0f2a330fe7af1b09a0b1205eb516076caebd3dcbe61ef11cfd70dd6006868c2" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.413454 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\
",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.427193 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.440870 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6thmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6thmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc 
kubenswrapper[4891]: I0929 09:48:39.455647 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.463483 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.463548 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.463562 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.463586 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.463602 4891 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:39Z","lastTransitionTime":"2025-09-29T09:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.471714 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.483998 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.495761 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.506895 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724eeb028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.526015 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f2a330fe7af1b09a0b1205eb516076caebd3dcbe61ef11cfd70dd6006868c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f2a330fe7af1b09a0b1205eb516076caebd3dcbe61ef11cfd70dd6006868c2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:48:27Z\\\",\\\"message\\\":\\\".217.4.176\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:80, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.176\\\\\\\"}, 
protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.176\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:1936, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0929 09:48:27.159147 6361 services_controller.go:444] Built service openshift-ingress/router-internal-default LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0929 09:48:27.159154 6361 services_controller.go:445] Built service openshift-ingress/router-internal-default LB template configs for network=default: []services.lbConfig(nil)\\\\nF0929 09:48:27.159155 6361 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fs6qf_openshift-ovn-kubernetes(01bb1c54-d2f0-498e-ad60-8216c29b843d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce
7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.542491 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://668c9c1c59bbc122cd5be3626616f5eedc5451c880f2b1fd5107e66b0d0f1ef8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d
3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.554443 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.567638 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.567690 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.567701 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.567721 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.567733 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:39Z","lastTransitionTime":"2025-09-29T09:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.568668 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.583629 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.599828 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.616454 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.636577 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888
be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:
48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.671543 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.671610 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.671626 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.671657 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.671674 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:39Z","lastTransitionTime":"2025-09-29T09:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.754121 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fs6qf_01bb1c54-d2f0-498e-ad60-8216c29b843d/ovnkube-controller/1.log" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.757566 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerStarted","Data":"57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93"} Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.757751 4891 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.774566 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.774654 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.774671 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.774700 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.774739 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:39Z","lastTransitionTime":"2025-09-29T09:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.778372 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.794269 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.811207 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.829486 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.853233 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.874885 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.877728 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.877814 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.877830 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.877853 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.877867 4891 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:39Z","lastTransitionTime":"2025-09-29T09:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.890745 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615ded802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.913986 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.932695 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724eeb028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74
c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.947691 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6thmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6thmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc 
kubenswrapper[4891]: I0929 09:48:39.966741 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.980989 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.981034 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.981047 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.981066 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.981078 4891 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:39Z","lastTransitionTime":"2025-09-29T09:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.981176 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6
68c9c1c59bbc122cd5be3626616f5eedc5451c880f2b1fd5107e66b0d0f1ef8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:39 crc kubenswrapper[4891]: I0929 09:48:39.994474 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:48:39Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.006644 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.017743 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.037527 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f2a330fe7af1b09a0b1205eb516076caebd3dcbe61ef11cfd70dd6006868c2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:48:27Z\\\",\\\"message\\\":\\\".217.4.176\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:80, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.176\\\\\\\"}, 
protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.176\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:1936, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0929 09:48:27.159147 6361 services_controller.go:444] Built service openshift-ingress/router-internal-default LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0929 09:48:27.159154 6361 services_controller.go:445] Built service openshift-ingress/router-internal-default LB template configs for network=default: []services.lbConfig(nil)\\\\nF0929 09:48:27.159155 6361 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.084353 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.084407 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.084420 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.084455 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.084468 4891 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:40Z","lastTransitionTime":"2025-09-29T09:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.187901 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.187944 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.187958 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.187979 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.187990 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:40Z","lastTransitionTime":"2025-09-29T09:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.258600 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.291073 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.291140 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.291153 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.291179 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.291194 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:40Z","lastTransitionTime":"2025-09-29T09:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.394223 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.394279 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.394353 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.394379 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.394393 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:40Z","lastTransitionTime":"2025-09-29T09:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.411085 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.423461 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.443312 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f2a330fe7af1b09a0b1205eb516076caebd3dcbe61ef11cfd70dd6006868c2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:48:27Z\\\",\\\"message\\\":\\\".217.4.176\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:80, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.176\\\\\\\"}, 
protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.176\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:1936, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0929 09:48:27.159147 6361 services_controller.go:444] Built service openshift-ingress/router-internal-default LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0929 09:48:27.159154 6361 services_controller.go:445] Built service openshift-ingress/router-internal-default LB template configs for network=default: []services.lbConfig(nil)\\\\nF0929 09:48:27.159155 6361 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.458014 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://668c9c1c59bbc122cd5be3626616f5eedc5451c880f2b1fd5107e66b0d0f1ef8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d
3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.473400 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.487841 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.497129 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.497191 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.497237 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.497263 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.497274 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:40Z","lastTransitionTime":"2025-09-29T09:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.502650 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.521088 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.537007 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.550318 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.564311 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.576280 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724eeb028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.587307 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6thmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6thmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc 
kubenswrapper[4891]: I0929 09:48:40.599940 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.599986 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.599998 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.600016 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.600027 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:40Z","lastTransitionTime":"2025-09-29T09:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.601158 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b
4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.617770 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.631291 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.702989 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.703062 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.703073 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:40 crc 
kubenswrapper[4891]: I0929 09:48:40.703094 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.703105 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:40Z","lastTransitionTime":"2025-09-29T09:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.761910 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fs6qf_01bb1c54-d2f0-498e-ad60-8216c29b843d/ovnkube-controller/2.log" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.762580 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fs6qf_01bb1c54-d2f0-498e-ad60-8216c29b843d/ovnkube-controller/1.log" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.765580 4891 generic.go:334] "Generic (PLEG): container finished" podID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerID="57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93" exitCode=1 Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.765630 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerDied","Data":"57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93"} Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.765678 4891 scope.go:117] "RemoveContainer" containerID="b0f2a330fe7af1b09a0b1205eb516076caebd3dcbe61ef11cfd70dd6006868c2" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.766316 4891 scope.go:117] "RemoveContainer" 
containerID="57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93" Sep 29 09:48:40 crc kubenswrapper[4891]: E0929 09:48:40.766557 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-fs6qf_openshift-ovn-kubernetes(01bb1c54-d2f0-498e-ad60-8216c29b843d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.783818 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c0
26b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\
\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.797475 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.805800 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.805829 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.805837 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.805853 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.805864 4891 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:40Z","lastTransitionTime":"2025-09-29T09:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.809620 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615ded802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.821689 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.835114 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724eeb028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74
c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.847878 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6thmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6thmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc 
kubenswrapper[4891]: I0929 09:48:40.863426 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f3
55f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://668c9c1c59bbc122cd5be3626616f5eedc5451c880f2b1fd5107e66b0d0f1ef8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 
09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.879501 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.892711 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.903741 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.908964 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.909019 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.909031 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.909053 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.909064 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:40Z","lastTransitionTime":"2025-09-29T09:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.926372 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f2a330fe7af1b09a0b1205eb516076caebd3dcbe61ef11cfd70dd6006868c2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:48:27Z\\\",\\\"message\\\":\\\".217.4.176\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:80, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.176\\\\\\\"}, 
protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.176\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:1936, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0929 09:48:27.159147 6361 services_controller.go:444] Built service openshift-ingress/router-internal-default LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0929 09:48:27.159154 6361 services_controller.go:445] Built service openshift-ingress/router-internal-default LB template configs for network=default: []services.lbConfig(nil)\\\\nF0929 09:48:27.159155 6361 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:48:40Z\\\",\\\"message\\\":\\\"er for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z]\\\\nI0929 09:48:40.329821 6539 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-diagnostics/network-check-target]} name:Service_openshift-network-diagnostics/network-check-target_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 09:48:40.329605 6539 model_client.go:382] Update 
o\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.940703 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.954485 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.971377 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888
be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:
48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.986204 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:40 crc kubenswrapper[4891]: I0929 09:48:40.999772 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.012066 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.012124 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.012133 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 
09:48:41.012153 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.012162 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:41Z","lastTransitionTime":"2025-09-29T09:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.115345 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.115651 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.115664 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.115686 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.115700 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:41Z","lastTransitionTime":"2025-09-29T09:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.218482 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.218527 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.218537 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.218553 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.218564 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:41Z","lastTransitionTime":"2025-09-29T09:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.321285 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.321340 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.321353 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.321372 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.321384 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:41Z","lastTransitionTime":"2025-09-29T09:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.395099 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.395231 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:41 crc kubenswrapper[4891]: E0929 09:48:41.395303 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.395367 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.395387 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:41 crc kubenswrapper[4891]: E0929 09:48:41.395481 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:48:41 crc kubenswrapper[4891]: E0929 09:48:41.395690 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:41 crc kubenswrapper[4891]: E0929 09:48:41.395830 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.424666 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.424737 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.424805 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.424834 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.424856 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:41Z","lastTransitionTime":"2025-09-29T09:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.527946 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.528001 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.528013 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.528043 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.528057 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:41Z","lastTransitionTime":"2025-09-29T09:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.631069 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.631125 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.631139 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.631159 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.631173 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:41Z","lastTransitionTime":"2025-09-29T09:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.734202 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.734272 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.734289 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.734315 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.734330 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:41Z","lastTransitionTime":"2025-09-29T09:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.772538 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fs6qf_01bb1c54-d2f0-498e-ad60-8216c29b843d/ovnkube-controller/2.log" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.774150 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs\") pod \"network-metrics-daemon-6thmw\" (UID: \"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\") " pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:41 crc kubenswrapper[4891]: E0929 09:48:41.774708 4891 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:48:41 crc kubenswrapper[4891]: E0929 09:48:41.774846 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs podName:45417d1e-e3f1-4cc9-9f51-65affc9d72f6 nodeName:}" failed. No retries permitted until 2025-09-29 09:48:57.774815374 +0000 UTC m=+67.979983815 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs") pod "network-metrics-daemon-6thmw" (UID: "45417d1e-e3f1-4cc9-9f51-65affc9d72f6") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.776973 4891 scope.go:117] "RemoveContainer" containerID="57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93" Sep 29 09:48:41 crc kubenswrapper[4891]: E0929 09:48:41.777172 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-fs6qf_openshift-ovn-kubernetes(01bb1c54-d2f0-498e-ad60-8216c29b843d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.791056 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:41Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.803773 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:41Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.817111 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724eeb028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:41Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.833130 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6thmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6thmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:41Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:41 crc 
kubenswrapper[4891]: I0929 09:48:41.837656 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.837709 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.837723 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.837745 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.837765 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:41Z","lastTransitionTime":"2025-09-29T09:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.852467 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b
4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:41Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.868861 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:41Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.881549 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:48:41Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.893503 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:41Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.905179 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:41Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.924295 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:48:40Z\\\",\\\"message\\\":\\\"er for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling 
webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z]\\\\nI0929 09:48:40.329821 6539 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-diagnostics/network-check-target]} name:Service_openshift-network-diagnostics/network-check-target_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 09:48:40.329605 6539 model_client.go:382] Update o\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fs6qf_openshift-ovn-kubernetes(01bb1c54-d2f0-498e-ad60-8216c29b843d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce
7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:41Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.939901 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://668c9c1c59bbc122cd5be3626616f5eedc5451c880f2b1fd5107e66b0d0f1ef8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d
3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:41Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.941097 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.941141 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.941155 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.941180 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.941196 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:41Z","lastTransitionTime":"2025-09-29T09:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.957832 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:41Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.972265 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:41Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:41 crc kubenswrapper[4891]: I0929 09:48:41.987277 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888
be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:
48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:41Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.007505 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:42Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.025997 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:42Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.044834 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.044880 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.044890 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 
09:48:42.044907 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.044917 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:42Z","lastTransitionTime":"2025-09-29T09:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.147650 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.147717 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.147730 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.147842 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.147860 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:42Z","lastTransitionTime":"2025-09-29T09:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.251159 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.251209 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.251221 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.251243 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.251260 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:42Z","lastTransitionTime":"2025-09-29T09:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.353687 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.353811 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.353823 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.353845 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.353854 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:42Z","lastTransitionTime":"2025-09-29T09:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.457155 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.457201 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.457216 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.457236 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.457249 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:42Z","lastTransitionTime":"2025-09-29T09:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.560079 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.560112 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.560122 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.560138 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.560148 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:42Z","lastTransitionTime":"2025-09-29T09:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.663077 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.663122 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.663130 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.663147 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.663157 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:42Z","lastTransitionTime":"2025-09-29T09:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.766222 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.766653 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.766744 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.766876 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.766963 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:42Z","lastTransitionTime":"2025-09-29T09:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.870569 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.870626 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.870637 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.870661 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.870676 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:42Z","lastTransitionTime":"2025-09-29T09:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.973039 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.973086 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.973096 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.973115 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:42 crc kubenswrapper[4891]: I0929 09:48:42.973127 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:42Z","lastTransitionTime":"2025-09-29T09:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.076589 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.076645 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.076660 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.076679 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.076692 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:43Z","lastTransitionTime":"2025-09-29T09:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.179995 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.180040 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.180050 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.180066 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.180077 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:43Z","lastTransitionTime":"2025-09-29T09:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.191360 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:48:43 crc kubenswrapper[4891]: E0929 09:48:43.191538 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-29 09:49:15.191511705 +0000 UTC m=+85.396680026 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.283353 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.283406 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.283422 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.283444 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.283455 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:43Z","lastTransitionTime":"2025-09-29T09:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:43 crc kubenswrapper[4891]: E0929 09:48:43.293188 4891 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:48:43 crc kubenswrapper[4891]: E0929 09:48:43.293288 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 09:49:15.293261436 +0000 UTC m=+85.498429757 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.293033 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.293642 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:43 crc kubenswrapper[4891]: E0929 09:48:43.293750 4891 
configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:48:43 crc kubenswrapper[4891]: E0929 09:48:43.293821 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 09:49:15.293810232 +0000 UTC m=+85.498978553 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.294065 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:43 crc kubenswrapper[4891]: E0929 09:48:43.294207 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:48:43 crc kubenswrapper[4891]: E0929 09:48:43.294404 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:48:43 crc kubenswrapper[4891]: E0929 09:48:43.294420 4891 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:43 crc kubenswrapper[4891]: E0929 09:48:43.294455 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 09:49:15.294445371 +0000 UTC m=+85.499613692 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:43 crc kubenswrapper[4891]: E0929 09:48:43.294707 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:48:43 crc kubenswrapper[4891]: E0929 09:48:43.294777 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:48:43 crc kubenswrapper[4891]: E0929 09:48:43.294833 4891 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.294327 4891 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:43 crc kubenswrapper[4891]: E0929 09:48:43.294930 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 09:49:15.294903315 +0000 UTC m=+85.500071836 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.386223 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.386272 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.386287 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.386310 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.386321 4891 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:43Z","lastTransitionTime":"2025-09-29T09:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.396248 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:43 crc kubenswrapper[4891]: E0929 09:48:43.396402 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.396813 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.396857 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:43 crc kubenswrapper[4891]: E0929 09:48:43.396883 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.396935 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:43 crc kubenswrapper[4891]: E0929 09:48:43.397038 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:43 crc kubenswrapper[4891]: E0929 09:48:43.397148 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.489457 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.489522 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.489536 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.489564 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.489579 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:43Z","lastTransitionTime":"2025-09-29T09:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.592127 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.592188 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.592201 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.592255 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.592268 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:43Z","lastTransitionTime":"2025-09-29T09:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.695190 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.695244 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.695257 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.695276 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.695337 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:43Z","lastTransitionTime":"2025-09-29T09:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.798174 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.798243 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.798255 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.798300 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.798315 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:43Z","lastTransitionTime":"2025-09-29T09:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.901236 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.901296 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.901309 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.901329 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:43 crc kubenswrapper[4891]: I0929 09:48:43.901345 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:43Z","lastTransitionTime":"2025-09-29T09:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.004544 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.004606 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.004616 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.004638 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.004652 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:44Z","lastTransitionTime":"2025-09-29T09:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.107902 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.107953 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.107963 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.107982 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.107993 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:44Z","lastTransitionTime":"2025-09-29T09:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.210469 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.210520 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.210532 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.210552 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.210564 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:44Z","lastTransitionTime":"2025-09-29T09:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.317258 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.317322 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.317336 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.317361 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.317374 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:44Z","lastTransitionTime":"2025-09-29T09:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.419675 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.419725 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.419738 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.419759 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.419773 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:44Z","lastTransitionTime":"2025-09-29T09:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.522929 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.522977 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.522994 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.523014 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.523029 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:44Z","lastTransitionTime":"2025-09-29T09:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.626206 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.626266 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.626276 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.626299 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.626312 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:44Z","lastTransitionTime":"2025-09-29T09:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.729190 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.729243 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.729255 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.729278 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.729289 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:44Z","lastTransitionTime":"2025-09-29T09:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.832874 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.832930 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.832942 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.832969 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.832983 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:44Z","lastTransitionTime":"2025-09-29T09:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.932734 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.935483 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.935524 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.935537 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.935555 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.935566 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:44Z","lastTransitionTime":"2025-09-29T09:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.947956 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.948429 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615ded802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:44Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.962586 4891 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\
"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:44Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.978329 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724eeb028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef
318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:44Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:44 crc kubenswrapper[4891]: I0929 09:48:44.990087 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6thmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6thmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:44Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:45 crc 
kubenswrapper[4891]: I0929 09:48:45.005644 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.023226 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.039502 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.039571 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.039583 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.039604 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.039616 4891 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:45Z","lastTransitionTime":"2025-09-29T09:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.041328 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.059592 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.075526 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.102867 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:48:40Z\\\",\\\"message\\\":\\\"er for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling 
webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z]\\\\nI0929 09:48:40.329821 6539 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-diagnostics/network-check-target]} name:Service_openshift-network-diagnostics/network-check-target_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 09:48:40.329605 6539 model_client.go:382] Update o\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fs6qf_openshift-ovn-kubernetes(01bb1c54-d2f0-498e-ad60-8216c29b843d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce
7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.122044 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://668c9c1c59bbc122cd5be3626616f5eedc5451c880f2b1fd5107e66b0d0f1ef8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d
3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.136649 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.141892 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.141943 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.141965 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:45 crc 
kubenswrapper[4891]: I0929 09:48:45.141984 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.141994 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:45Z","lastTransitionTime":"2025-09-29T09:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.152968 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.168929 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f
2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d825
08bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.193647 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.211857 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.244862 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.244923 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.244935 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.244954 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.244971 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:45Z","lastTransitionTime":"2025-09-29T09:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.348068 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.348121 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.348132 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.348154 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.348164 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:45Z","lastTransitionTime":"2025-09-29T09:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.395043 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.395041 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:45 crc kubenswrapper[4891]: E0929 09:48:45.395202 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.395064 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:45 crc kubenswrapper[4891]: E0929 09:48:45.395268 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.395038 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:45 crc kubenswrapper[4891]: E0929 09:48:45.395345 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:48:45 crc kubenswrapper[4891]: E0929 09:48:45.395391 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.450541 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.450574 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.450584 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.450603 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.450616 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:45Z","lastTransitionTime":"2025-09-29T09:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.553983 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.554034 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.554049 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.554073 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.554087 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:45Z","lastTransitionTime":"2025-09-29T09:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.657583 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.657642 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.657655 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.657680 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.657696 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:45Z","lastTransitionTime":"2025-09-29T09:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.760071 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.760108 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.760118 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.760185 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.760197 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:45Z","lastTransitionTime":"2025-09-29T09:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.862531 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.862581 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.862595 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.862617 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.862635 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:45Z","lastTransitionTime":"2025-09-29T09:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.965549 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.965591 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.965599 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.965615 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:45 crc kubenswrapper[4891]: I0929 09:48:45.965629 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:45Z","lastTransitionTime":"2025-09-29T09:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.068509 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.068765 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.068775 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.068824 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.068845 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:46Z","lastTransitionTime":"2025-09-29T09:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.171078 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.171121 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.171131 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.171146 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.171155 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:46Z","lastTransitionTime":"2025-09-29T09:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.273648 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.273699 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.273712 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.273733 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.273753 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:46Z","lastTransitionTime":"2025-09-29T09:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.376554 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.376612 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.376626 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.376644 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.376656 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:46Z","lastTransitionTime":"2025-09-29T09:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.479508 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.479559 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.479568 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.479588 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.479602 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:46Z","lastTransitionTime":"2025-09-29T09:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.582718 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.582776 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.582803 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.582824 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.582837 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:46Z","lastTransitionTime":"2025-09-29T09:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.684827 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.684871 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.684882 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.684907 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.684923 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:46Z","lastTransitionTime":"2025-09-29T09:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.787924 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.787967 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.787977 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.788028 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.788045 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:46Z","lastTransitionTime":"2025-09-29T09:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.891105 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.891186 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.891207 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.891242 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.891272 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:46Z","lastTransitionTime":"2025-09-29T09:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.994567 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.994632 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.994643 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.994878 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:46 crc kubenswrapper[4891]: I0929 09:48:46.994890 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:46Z","lastTransitionTime":"2025-09-29T09:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.097833 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.097919 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.097933 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.097959 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.097977 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:47Z","lastTransitionTime":"2025-09-29T09:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.200476 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.200525 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.200536 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.200557 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.200569 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:47Z","lastTransitionTime":"2025-09-29T09:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.303125 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.303170 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.303179 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.303201 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.303212 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:47Z","lastTransitionTime":"2025-09-29T09:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.395568 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.395642 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.395701 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:47 crc kubenswrapper[4891]: E0929 09:48:47.395771 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.395848 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:47 crc kubenswrapper[4891]: E0929 09:48:47.396013 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:47 crc kubenswrapper[4891]: E0929 09:48:47.396139 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:47 crc kubenswrapper[4891]: E0929 09:48:47.396244 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.405727 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.405756 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.405766 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.405782 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.405826 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:47Z","lastTransitionTime":"2025-09-29T09:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.508865 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.508911 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.508925 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.508952 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.508966 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:47Z","lastTransitionTime":"2025-09-29T09:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.612266 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.612367 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.612397 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.612439 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.612460 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:47Z","lastTransitionTime":"2025-09-29T09:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.715552 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.715610 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.715622 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.715642 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.715656 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:47Z","lastTransitionTime":"2025-09-29T09:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.819103 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.819153 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.819204 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.819227 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.819241 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:47Z","lastTransitionTime":"2025-09-29T09:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.922836 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.922899 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.922910 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.922940 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:47 crc kubenswrapper[4891]: I0929 09:48:47.922952 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:47Z","lastTransitionTime":"2025-09-29T09:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.026621 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.026690 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.026706 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.026730 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.026745 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:48Z","lastTransitionTime":"2025-09-29T09:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.129896 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.129977 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.130002 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.130038 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.130064 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:48Z","lastTransitionTime":"2025-09-29T09:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.232708 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.232753 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.232763 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.232784 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.232820 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:48Z","lastTransitionTime":"2025-09-29T09:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.335263 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.335296 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.335305 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.335323 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.335334 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:48Z","lastTransitionTime":"2025-09-29T09:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.440409 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.440480 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.440499 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.440524 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.440540 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:48Z","lastTransitionTime":"2025-09-29T09:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.544503 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.544560 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.544576 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.544596 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.544607 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:48Z","lastTransitionTime":"2025-09-29T09:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.639985 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.640065 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.640085 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.640120 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.640145 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:48Z","lastTransitionTime":"2025-09-29T09:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:48 crc kubenswrapper[4891]: E0929 09:48:48.658350 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:48Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.663065 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.663109 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.663121 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.663141 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.663156 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:48Z","lastTransitionTime":"2025-09-29T09:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:48 crc kubenswrapper[4891]: E0929 09:48:48.678699 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:48Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.683775 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.683855 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.683866 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.683885 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.683896 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:48Z","lastTransitionTime":"2025-09-29T09:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:48 crc kubenswrapper[4891]: E0929 09:48:48.697277 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:48Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.703109 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.703170 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.703183 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.703203 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.703216 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:48Z","lastTransitionTime":"2025-09-29T09:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:48 crc kubenswrapper[4891]: E0929 09:48:48.716664 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:48Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.721113 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.721157 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.721168 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.721188 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.721201 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:48Z","lastTransitionTime":"2025-09-29T09:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:48 crc kubenswrapper[4891]: E0929 09:48:48.734406 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:48Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:48 crc kubenswrapper[4891]: E0929 09:48:48.734541 4891 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.736509 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.736544 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.736555 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.736575 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.736589 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:48Z","lastTransitionTime":"2025-09-29T09:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.843819 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.843877 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.843890 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.843917 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.843935 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:48Z","lastTransitionTime":"2025-09-29T09:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.946712 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.946771 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.946783 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.946831 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:48 crc kubenswrapper[4891]: I0929 09:48:48.946848 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:48Z","lastTransitionTime":"2025-09-29T09:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.049282 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.049327 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.049339 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.049360 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.049387 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:49Z","lastTransitionTime":"2025-09-29T09:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.151921 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.151964 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.151973 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.151991 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.152001 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:49Z","lastTransitionTime":"2025-09-29T09:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.254447 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.254493 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.254526 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.254544 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.254554 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:49Z","lastTransitionTime":"2025-09-29T09:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.357173 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.357245 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.357324 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.357352 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.357365 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:49Z","lastTransitionTime":"2025-09-29T09:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.394969 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.394999 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.395054 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:49 crc kubenswrapper[4891]: E0929 09:48:49.395121 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.395198 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:49 crc kubenswrapper[4891]: E0929 09:48:49.395359 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:48:49 crc kubenswrapper[4891]: E0929 09:48:49.395434 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:49 crc kubenswrapper[4891]: E0929 09:48:49.395555 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.459990 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.460033 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.460042 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.460058 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.460067 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:49Z","lastTransitionTime":"2025-09-29T09:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.562837 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.562886 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.562896 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.562914 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.562925 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:49Z","lastTransitionTime":"2025-09-29T09:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.676593 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.676681 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.676702 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.676730 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.676749 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:49Z","lastTransitionTime":"2025-09-29T09:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.779562 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.779619 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.779633 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.779656 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.779670 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:49Z","lastTransitionTime":"2025-09-29T09:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.882294 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.882340 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.882356 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.882375 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.882385 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:49Z","lastTransitionTime":"2025-09-29T09:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.984861 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.984904 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.984912 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.984929 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:49 crc kubenswrapper[4891]: I0929 09:48:49.984939 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:49Z","lastTransitionTime":"2025-09-29T09:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.088529 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.088610 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.088625 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.088654 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.088671 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:50Z","lastTransitionTime":"2025-09-29T09:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.190987 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.191042 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.191052 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.191070 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.191081 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:50Z","lastTransitionTime":"2025-09-29T09:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.294057 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.294093 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.294101 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.294117 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.294128 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:50Z","lastTransitionTime":"2025-09-29T09:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.395952 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.395998 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.396007 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.396029 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.396039 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:50Z","lastTransitionTime":"2025-09-29T09:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.406909 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6thmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6thmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:50Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:50 crc 
kubenswrapper[4891]: I0929 09:48:50.419206 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:50Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.430475 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:50Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.441210 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:50Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.452111 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:50Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.463940 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724eeb028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:50Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.484380 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:48:40Z\\\",\\\"message\\\":\\\"er for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling 
webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z]\\\\nI0929 09:48:40.329821 6539 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-diagnostics/network-check-target]} name:Service_openshift-network-diagnostics/network-check-target_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 09:48:40.329605 6539 model_client.go:382] Update o\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fs6qf_openshift-ovn-kubernetes(01bb1c54-d2f0-498e-ad60-8216c29b843d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce
7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:50Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.498280 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.498321 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.498253 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb72d414-a523-4f65-b189-e7128b35c535\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2215d70654d4558acf393bc3f75c191cb130c14b1a705de6f7ef040d792afa90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://352c9f8910374f43aa116300526704ebe076299397ecd20b86be658d53f38593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9a72f3a53e9bfc74e1f0bb793af51187cb6f11787a54af3e775c5a271b8b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016dda820498b6f4f30aecbd0eded36505e3fb2a366f19a1ebd0a77eabc1b82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://016dda820498b6f4f30aecbd0eded36505e3fb2a366f19a1ebd0a77eabc1b82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:50Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.498334 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.498482 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.498681 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:50Z","lastTransitionTime":"2025-09-29T09:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.517823 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://668c9c1c59bbc122cd5be3626616f5eedc5451c880f2b1fd5107e66b0d0f1ef8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:50Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.531137 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:48:50Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.544409 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:50Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.555808 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:50Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.569172 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:50Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.580210 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:50Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.594879 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888
be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:
48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:50Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.601032 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.601070 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.601080 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.601097 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.601110 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:50Z","lastTransitionTime":"2025-09-29T09:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.608130 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:50Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.619413 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:50Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.703502 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.703534 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.703542 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.703558 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.703567 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:50Z","lastTransitionTime":"2025-09-29T09:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.806186 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.806227 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.806236 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.806251 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.806262 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:50Z","lastTransitionTime":"2025-09-29T09:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.909769 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.909885 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.909902 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.909926 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:50 crc kubenswrapper[4891]: I0929 09:48:50.909940 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:50Z","lastTransitionTime":"2025-09-29T09:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.012604 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.012666 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.012676 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.012697 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.012709 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:51Z","lastTransitionTime":"2025-09-29T09:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.115500 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.115549 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.115560 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.115580 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.115591 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:51Z","lastTransitionTime":"2025-09-29T09:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.219006 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.219073 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.219087 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.219109 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.219120 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:51Z","lastTransitionTime":"2025-09-29T09:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.321470 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.321518 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.321564 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.321583 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.321595 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:51Z","lastTransitionTime":"2025-09-29T09:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.395010 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.395058 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.395029 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.395225 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:51 crc kubenswrapper[4891]: E0929 09:48:51.395182 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:51 crc kubenswrapper[4891]: E0929 09:48:51.395330 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:48:51 crc kubenswrapper[4891]: E0929 09:48:51.395382 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:51 crc kubenswrapper[4891]: E0929 09:48:51.395569 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.424834 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.424898 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.424912 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.424935 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.424948 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:51Z","lastTransitionTime":"2025-09-29T09:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.527651 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.527695 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.527705 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.527721 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.527734 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:51Z","lastTransitionTime":"2025-09-29T09:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.631353 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.631411 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.631426 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.631454 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.631472 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:51Z","lastTransitionTime":"2025-09-29T09:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.734016 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.734083 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.734101 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.734124 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.734142 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:51Z","lastTransitionTime":"2025-09-29T09:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.837278 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.837416 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.837430 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.837450 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.837465 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:51Z","lastTransitionTime":"2025-09-29T09:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.941333 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.941394 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.941407 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.941427 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:51 crc kubenswrapper[4891]: I0929 09:48:51.941441 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:51Z","lastTransitionTime":"2025-09-29T09:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.044261 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.044323 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.044341 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.044361 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.044376 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:52Z","lastTransitionTime":"2025-09-29T09:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.147115 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.147155 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.147167 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.147186 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.147197 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:52Z","lastTransitionTime":"2025-09-29T09:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.250442 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.250475 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.250484 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.250498 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.250509 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:52Z","lastTransitionTime":"2025-09-29T09:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.354122 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.354174 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.354188 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.354213 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.354225 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:52Z","lastTransitionTime":"2025-09-29T09:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.457758 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.457846 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.457859 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.457882 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.457896 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:52Z","lastTransitionTime":"2025-09-29T09:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.562113 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.562164 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.562177 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.562199 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.562214 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:52Z","lastTransitionTime":"2025-09-29T09:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.665064 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.665113 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.665148 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.665166 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.665176 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:52Z","lastTransitionTime":"2025-09-29T09:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.768193 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.768266 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.768284 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.768315 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.768393 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:52Z","lastTransitionTime":"2025-09-29T09:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.872092 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.872145 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.872156 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.872174 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.872185 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:52Z","lastTransitionTime":"2025-09-29T09:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.975344 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.975397 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.975408 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.975430 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:52 crc kubenswrapper[4891]: I0929 09:48:52.975445 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:52Z","lastTransitionTime":"2025-09-29T09:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.078301 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.078415 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.078425 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.078445 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.078458 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:53Z","lastTransitionTime":"2025-09-29T09:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.181995 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.182053 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.182071 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.182093 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.182107 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:53Z","lastTransitionTime":"2025-09-29T09:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.284579 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.284650 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.284662 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.284683 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.284721 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:53Z","lastTransitionTime":"2025-09-29T09:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.387296 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.387375 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.387391 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.387411 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.387422 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:53Z","lastTransitionTime":"2025-09-29T09:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.394734 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.394839 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.394847 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.394844 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:53 crc kubenswrapper[4891]: E0929 09:48:53.394987 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:53 crc kubenswrapper[4891]: E0929 09:48:53.395204 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:53 crc kubenswrapper[4891]: E0929 09:48:53.395291 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:48:53 crc kubenswrapper[4891]: E0929 09:48:53.395413 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.489616 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.489665 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.489676 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.489694 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.489705 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:53Z","lastTransitionTime":"2025-09-29T09:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.592388 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.592443 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.592452 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.592503 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.592512 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:53Z","lastTransitionTime":"2025-09-29T09:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.694856 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.694908 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.694916 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.694933 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.694943 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:53Z","lastTransitionTime":"2025-09-29T09:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.798071 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.798113 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.798135 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.798157 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.798170 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:53Z","lastTransitionTime":"2025-09-29T09:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.900979 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.901024 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.901033 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.901050 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:53 crc kubenswrapper[4891]: I0929 09:48:53.901061 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:53Z","lastTransitionTime":"2025-09-29T09:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.003488 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.003523 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.003533 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.003548 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.003559 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:54Z","lastTransitionTime":"2025-09-29T09:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.106832 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.106939 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.106956 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.106981 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.106995 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:54Z","lastTransitionTime":"2025-09-29T09:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.209614 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.209675 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.209684 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.209730 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.209742 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:54Z","lastTransitionTime":"2025-09-29T09:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.312849 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.312893 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.312904 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.312925 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.312938 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:54Z","lastTransitionTime":"2025-09-29T09:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.415574 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.415651 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.415672 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.415696 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.415719 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:54Z","lastTransitionTime":"2025-09-29T09:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.518650 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.518704 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.518722 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.518746 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.518764 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:54Z","lastTransitionTime":"2025-09-29T09:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.620615 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.620659 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.620668 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.620684 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.620696 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:54Z","lastTransitionTime":"2025-09-29T09:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.723594 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.723648 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.723660 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.723682 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.723695 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:54Z","lastTransitionTime":"2025-09-29T09:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.826199 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.826256 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.826269 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.826291 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.826305 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:54Z","lastTransitionTime":"2025-09-29T09:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.929687 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.929733 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.929744 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.929768 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:54 crc kubenswrapper[4891]: I0929 09:48:54.929780 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:54Z","lastTransitionTime":"2025-09-29T09:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.031732 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.031865 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.031880 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.031901 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.031913 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:55Z","lastTransitionTime":"2025-09-29T09:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.134817 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.134864 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.134876 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.134897 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.134908 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:55Z","lastTransitionTime":"2025-09-29T09:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.238174 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.238217 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.238226 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.238241 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.238254 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:55Z","lastTransitionTime":"2025-09-29T09:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.340738 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.340809 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.340820 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.340837 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.340849 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:55Z","lastTransitionTime":"2025-09-29T09:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.395170 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:55 crc kubenswrapper[4891]: E0929 09:48:55.395341 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.396406 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:55 crc kubenswrapper[4891]: E0929 09:48:55.396487 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.396482 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.396530 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:55 crc kubenswrapper[4891]: E0929 09:48:55.396558 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:55 crc kubenswrapper[4891]: E0929 09:48:55.396653 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.396880 4891 scope.go:117] "RemoveContainer" containerID="57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93" Sep 29 09:48:55 crc kubenswrapper[4891]: E0929 09:48:55.397212 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-fs6qf_openshift-ovn-kubernetes(01bb1c54-d2f0-498e-ad60-8216c29b843d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.443080 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.443149 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.443163 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.443187 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.443210 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:55Z","lastTransitionTime":"2025-09-29T09:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.546098 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.546149 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.546159 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.546178 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.546191 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:55Z","lastTransitionTime":"2025-09-29T09:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.648600 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.648719 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.648735 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.648758 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.648770 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:55Z","lastTransitionTime":"2025-09-29T09:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.751601 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.751641 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.751650 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.751666 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.751677 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:55Z","lastTransitionTime":"2025-09-29T09:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.854537 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.854577 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.854588 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.854606 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.854619 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:55Z","lastTransitionTime":"2025-09-29T09:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.957321 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.957368 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.957378 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.957396 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:55 crc kubenswrapper[4891]: I0929 09:48:55.957407 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:55Z","lastTransitionTime":"2025-09-29T09:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.060135 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.060191 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.060207 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.060228 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.060239 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:56Z","lastTransitionTime":"2025-09-29T09:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.163216 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.163315 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.163329 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.163352 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.163364 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:56Z","lastTransitionTime":"2025-09-29T09:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.265941 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.265978 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.265987 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.266004 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.266014 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:56Z","lastTransitionTime":"2025-09-29T09:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.368728 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.368815 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.368830 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.368852 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.368868 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:56Z","lastTransitionTime":"2025-09-29T09:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.471958 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.472005 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.472035 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.472054 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.472064 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:56Z","lastTransitionTime":"2025-09-29T09:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.574522 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.574569 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.574580 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.574597 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.574606 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:56Z","lastTransitionTime":"2025-09-29T09:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.676685 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.676819 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.676835 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.676885 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.676897 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:56Z","lastTransitionTime":"2025-09-29T09:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.779362 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.779427 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.779439 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.779462 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.779476 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:56Z","lastTransitionTime":"2025-09-29T09:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.882879 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.882971 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.882982 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.883015 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.883027 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:56Z","lastTransitionTime":"2025-09-29T09:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.985147 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.985203 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.985216 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.985240 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:56 crc kubenswrapper[4891]: I0929 09:48:56.985254 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:56Z","lastTransitionTime":"2025-09-29T09:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.087683 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.087731 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.087740 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.087759 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.087772 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:57Z","lastTransitionTime":"2025-09-29T09:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.190867 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.190911 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.190921 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.190940 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.190953 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:57Z","lastTransitionTime":"2025-09-29T09:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.293768 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.293819 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.293829 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.293849 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.293861 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:57Z","lastTransitionTime":"2025-09-29T09:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.395233 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.395261 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.395304 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.395344 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:57 crc kubenswrapper[4891]: E0929 09:48:57.395430 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:57 crc kubenswrapper[4891]: E0929 09:48:57.395525 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:57 crc kubenswrapper[4891]: E0929 09:48:57.395687 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:57 crc kubenswrapper[4891]: E0929 09:48:57.395841 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.397069 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.397109 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.397124 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.397145 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.397159 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:57Z","lastTransitionTime":"2025-09-29T09:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.500403 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.500452 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.500463 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.500482 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.500492 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:57Z","lastTransitionTime":"2025-09-29T09:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.607663 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.607731 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.607748 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.607772 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.608856 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:57Z","lastTransitionTime":"2025-09-29T09:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.711899 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.711982 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.712005 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.712037 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.712055 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:57Z","lastTransitionTime":"2025-09-29T09:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.814166 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.814213 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.814223 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.814240 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.814254 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:57Z","lastTransitionTime":"2025-09-29T09:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.862524 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs\") pod \"network-metrics-daemon-6thmw\" (UID: \"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\") " pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:57 crc kubenswrapper[4891]: E0929 09:48:57.862705 4891 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:48:57 crc kubenswrapper[4891]: E0929 09:48:57.862781 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs podName:45417d1e-e3f1-4cc9-9f51-65affc9d72f6 nodeName:}" failed. No retries permitted until 2025-09-29 09:49:29.862759846 +0000 UTC m=+100.067928167 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs") pod "network-metrics-daemon-6thmw" (UID: "45417d1e-e3f1-4cc9-9f51-65affc9d72f6") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.916936 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.916977 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.916987 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.917006 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:57 crc kubenswrapper[4891]: I0929 09:48:57.917017 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:57Z","lastTransitionTime":"2025-09-29T09:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.021005 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.021071 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.021082 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.021100 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.021111 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:58Z","lastTransitionTime":"2025-09-29T09:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.123815 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.123859 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.123870 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.123888 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.123899 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:58Z","lastTransitionTime":"2025-09-29T09:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.225901 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.225947 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.225959 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.225978 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.225988 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:58Z","lastTransitionTime":"2025-09-29T09:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.329091 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.329143 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.329154 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.329174 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.329186 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:58Z","lastTransitionTime":"2025-09-29T09:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.431773 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.431836 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.431847 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.431868 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.431878 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:58Z","lastTransitionTime":"2025-09-29T09:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.534816 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.534855 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.534868 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.534887 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.534932 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:58Z","lastTransitionTime":"2025-09-29T09:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.637325 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.637369 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.637377 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.637393 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.637402 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:58Z","lastTransitionTime":"2025-09-29T09:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.739944 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.740017 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.740034 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.740057 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.740072 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:58Z","lastTransitionTime":"2025-09-29T09:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.842649 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.842713 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.842729 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.842749 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.842763 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:58Z","lastTransitionTime":"2025-09-29T09:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.895634 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.895690 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.895698 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.895716 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.895726 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:58Z","lastTransitionTime":"2025-09-29T09:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:58 crc kubenswrapper[4891]: E0929 09:48:58.908611 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.913781 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.913857 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.913872 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.913891 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.913903 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:58Z","lastTransitionTime":"2025-09-29T09:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:58 crc kubenswrapper[4891]: E0929 09:48:58.927206 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.930970 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.931006 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.931017 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.931035 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.931046 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:58Z","lastTransitionTime":"2025-09-29T09:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:58 crc kubenswrapper[4891]: E0929 09:48:58.943633 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.946628 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.946662 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.946674 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.946692 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.946704 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:58Z","lastTransitionTime":"2025-09-29T09:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:58 crc kubenswrapper[4891]: E0929 09:48:58.957138 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.960731 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.960763 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.960774 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.960807 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.960820 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:58Z","lastTransitionTime":"2025-09-29T09:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:58 crc kubenswrapper[4891]: E0929 09:48:58.972525 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:48:58 crc kubenswrapper[4891]: E0929 09:48:58.972691 4891 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.974516 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.974556 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.974569 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.974587 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:58 crc kubenswrapper[4891]: I0929 09:48:58.974600 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:58Z","lastTransitionTime":"2025-09-29T09:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.076906 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.076986 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.077001 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.077024 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.077036 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:59Z","lastTransitionTime":"2025-09-29T09:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.179639 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.179690 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.179700 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.179720 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.179731 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:59Z","lastTransitionTime":"2025-09-29T09:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.282560 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.282618 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.282631 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.282652 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.282663 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:59Z","lastTransitionTime":"2025-09-29T09:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.385073 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.385119 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.385129 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.385147 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.385159 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:59Z","lastTransitionTime":"2025-09-29T09:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.395641 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.395724 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:48:59 crc kubenswrapper[4891]: E0929 09:48:59.395997 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.395760 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:48:59 crc kubenswrapper[4891]: E0929 09:48:59.396258 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.395734 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:48:59 crc kubenswrapper[4891]: E0929 09:48:59.396025 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:48:59 crc kubenswrapper[4891]: E0929 09:48:59.396611 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.487169 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.487215 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.487224 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.487245 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.487255 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:59Z","lastTransitionTime":"2025-09-29T09:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.589825 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.589900 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.589915 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.589938 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.589954 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:59Z","lastTransitionTime":"2025-09-29T09:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.692085 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.692133 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.692143 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.692160 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.692170 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:59Z","lastTransitionTime":"2025-09-29T09:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.794537 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.794590 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.794601 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.794622 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.794636 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:59Z","lastTransitionTime":"2025-09-29T09:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.897903 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.897974 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.897990 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.898013 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:48:59 crc kubenswrapper[4891]: I0929 09:48:59.898028 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:48:59Z","lastTransitionTime":"2025-09-29T09:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.001558 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.001622 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.001633 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.001655 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.001669 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:00Z","lastTransitionTime":"2025-09-29T09:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.103918 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.104221 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.104296 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.104374 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.104437 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:00Z","lastTransitionTime":"2025-09-29T09:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.207357 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.207409 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.207421 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.207443 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.207457 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:00Z","lastTransitionTime":"2025-09-29T09:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.310385 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.310422 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.310432 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.310448 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.310457 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:00Z","lastTransitionTime":"2025-09-29T09:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.411920 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.412842 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.412867 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.412876 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.412893 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.412906 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:00Z","lastTransitionTime":"2025-09-29T09:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.426928 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.441961 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.458604 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.473143 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.484953 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.497919 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724eeb028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.512224 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6thmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6thmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:00 crc 
kubenswrapper[4891]: I0929 09:49:00.515355 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.515513 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.515599 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.515808 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.515900 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:00Z","lastTransitionTime":"2025-09-29T09:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.527841 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb72d414-a523-4f65-b189-e7128b35c535\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2215d70654d4558acf393bc3f75c191cb130c14b1a705de6f7ef040d792afa90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://352c9f8910374f43aa116300526704
ebe076299397ecd20b86be658d53f38593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9a72f3a53e9bfc74e1f0bb793af51187cb6f11787a54af3e775c5a271b8b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016dda820498b6f4f30aecbd0eded36505e3fb2a366f19a1ebd0a77eabc1b82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://016dda820498b6f4f30aecbd0eded36505e3fb2a366f19a1ebd0a77eabc1b82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.544572 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://668c9c1c59bbc122cd5be3626616f5eedc5451c880f2b1fd5107e66b0d0f1ef8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d
3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.558305 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:49:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.573870 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.588670 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.609270 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:48:40Z\\\",\\\"message\\\":\\\"er for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling 
webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z]\\\\nI0929 09:48:40.329821 6539 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-diagnostics/network-check-target]} name:Service_openshift-network-diagnostics/network-check-target_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 09:48:40.329605 6539 model_client.go:382] Update o\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fs6qf_openshift-ovn-kubernetes(01bb1c54-d2f0-498e-ad60-8216c29b843d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce
7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.618246 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.618305 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.618319 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.618338 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.618349 4891 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:00Z","lastTransitionTime":"2025-09-29T09:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.622775 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.636443 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.651890 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888
be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:
48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.724985 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.725047 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.725096 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.725159 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.725185 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:00Z","lastTransitionTime":"2025-09-29T09:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.828075 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.828115 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.828125 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.828141 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.828151 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:00Z","lastTransitionTime":"2025-09-29T09:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.930690 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.930736 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.930746 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.930764 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:00 crc kubenswrapper[4891]: I0929 09:49:00.930775 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:00Z","lastTransitionTime":"2025-09-29T09:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.033408 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.033454 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.033466 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.033484 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.033496 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:01Z","lastTransitionTime":"2025-09-29T09:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.136015 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.136053 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.136062 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.136076 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.136085 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:01Z","lastTransitionTime":"2025-09-29T09:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.238761 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.238893 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.238908 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.238929 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.238944 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:01Z","lastTransitionTime":"2025-09-29T09:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.341766 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.341832 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.341843 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.341860 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.341874 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:01Z","lastTransitionTime":"2025-09-29T09:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.395204 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:01 crc kubenswrapper[4891]: E0929 09:49:01.395359 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.395477 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:01 crc kubenswrapper[4891]: E0929 09:49:01.395700 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.395867 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.395970 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:01 crc kubenswrapper[4891]: E0929 09:49:01.396040 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:01 crc kubenswrapper[4891]: E0929 09:49:01.396146 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.444315 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.444378 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.444390 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.444411 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.444424 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:01Z","lastTransitionTime":"2025-09-29T09:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.547466 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.547986 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.548032 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.548060 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.548071 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:01Z","lastTransitionTime":"2025-09-29T09:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.650829 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.650874 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.650884 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.650904 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.650913 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:01Z","lastTransitionTime":"2025-09-29T09:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.753098 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.753188 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.753203 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.753223 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.753235 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:01Z","lastTransitionTime":"2025-09-29T09:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.853085 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ngmm4_4bfce090-366c-43be-ab12-d291b4d25217/kube-multus/0.log" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.853377 4891 generic.go:334] "Generic (PLEG): container finished" podID="4bfce090-366c-43be-ab12-d291b4d25217" containerID="0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c" exitCode=1 Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.853476 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ngmm4" event={"ID":"4bfce090-366c-43be-ab12-d291b4d25217","Type":"ContainerDied","Data":"0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c"} Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.853972 4891 scope.go:117] "RemoveContainer" containerID="0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.854663 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.855473 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.855571 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.855719 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.855831 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:01Z","lastTransitionTime":"2025-09-29T09:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.865197 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.874154 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.895503 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:48:40Z\\\",\\\"message\\\":\\\"er for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling 
webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z]\\\\nI0929 09:48:40.329821 6539 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-diagnostics/network-check-target]} name:Service_openshift-network-diagnostics/network-check-target_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 09:48:40.329605 6539 model_client.go:382] Update o\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fs6qf_openshift-ovn-kubernetes(01bb1c54-d2f0-498e-ad60-8216c29b843d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce
7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.910768 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb72d414-a523-4f65-b189-e7128b35c535\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2215d70654d4558acf393bc3f75c191cb130c14b1a705de6f7ef040d792afa90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://352c9f8910374f43aa116300526704ebe076299397ecd20b86be658d53f38593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9a72f3a53e9bfc74e1f0bb793af51187cb6f11787a54af3e775c5a271b8b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016dda820498b6f4f30aecbd0eded36505e3fb2a366f19a1ebd0a77eabc1b82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://016dda820498b6f4f30aecbd0eded36505e3fb2a366f19a1ebd0a77eabc1b82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.925471 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://668c9c1c59bbc122cd5be3626616f5eedc5451c880f2b1fd5107e66b0d0f1ef8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d
3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.941338 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:49:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.954853 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.958748 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.958818 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.958839 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.958863 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.958880 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:01Z","lastTransitionTime":"2025-09-29T09:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.967804 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:49:01Z\\\",\\\"message\\\":\\\"2025-09-29T09:48:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f2aa4435-037a-4e86-86a6-e636459566a3\\\\n2025-09-29T09:48:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f2aa4435-037a-4e86-86a6-e636459566a3 to /host/opt/cni/bin/\\\\n2025-09-29T09:48:16Z [verbose] multus-daemon started\\\\n2025-09-29T09:48:16Z [verbose] Readiness Indicator file check\\\\n2025-09-29T09:49:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:01 crc kubenswrapper[4891]: I0929 09:49:01.989772 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.009206 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.022827 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.035623 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.048585 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724eeb028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.060740 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6thmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6thmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:02 crc 
kubenswrapper[4891]: I0929 09:49:02.061101 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.061148 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.061165 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.061194 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.061209 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:02Z","lastTransitionTime":"2025-09-29T09:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.073411 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b
4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.089524 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.099436 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.162914 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.162956 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.162965 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:02 crc 
kubenswrapper[4891]: I0929 09:49:02.162980 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.162989 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:02Z","lastTransitionTime":"2025-09-29T09:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.265649 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.265679 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.265692 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.265707 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.265717 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:02Z","lastTransitionTime":"2025-09-29T09:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.368493 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.368531 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.368544 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.368560 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.368570 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:02Z","lastTransitionTime":"2025-09-29T09:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.471600 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.471667 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.471687 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.471710 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.471725 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:02Z","lastTransitionTime":"2025-09-29T09:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.574188 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.574226 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.574238 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.574258 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.574272 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:02Z","lastTransitionTime":"2025-09-29T09:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.676768 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.676838 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.676851 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.676872 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.676887 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:02Z","lastTransitionTime":"2025-09-29T09:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.781644 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.781798 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.781813 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.781834 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.781849 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:02Z","lastTransitionTime":"2025-09-29T09:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.858824 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ngmm4_4bfce090-366c-43be-ab12-d291b4d25217/kube-multus/0.log" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.858893 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ngmm4" event={"ID":"4bfce090-366c-43be-ab12-d291b4d25217","Type":"ContainerStarted","Data":"d37de03666ab8602a2a9d90c21788caee65748a0b3bdb4a81569c5bd05458aa8"} Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.869266 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.885659 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:48:40Z\\\",\\\"message\\\":\\\"er for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling 
webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z]\\\\nI0929 09:48:40.329821 6539 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-diagnostics/network-check-target]} name:Service_openshift-network-diagnostics/network-check-target_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 09:48:40.329605 6539 model_client.go:382] Update o\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fs6qf_openshift-ovn-kubernetes(01bb1c54-d2f0-498e-ad60-8216c29b843d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce
7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.886300 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.886332 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.886341 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.886357 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.886366 4891 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:02Z","lastTransitionTime":"2025-09-29T09:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.898479 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb72d414-a523-4f65-b189-e7128b35c535\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2215d70654d4558acf393bc3f75c191cb130c14b1a705de6f7ef040d792afa90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://352c9f8910374f43aa116300526704ebe076299397ecd20b86be658d53f38593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9a72f3a53e9bfc74e1f0bb793af51187cb6f11787a54af3e775c5a271b8b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016dda820498b6f4f30aecbd0eded36505e3fb2a366f19a1ebd0a77eabc1b82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://016dda820498b6f4f30aecbd0eded36505e3fb2a366f19a1ebd0a77eabc1b82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.912438 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://668c9c1c59bbc122cd5be3626616f5eedc5451c880f2b1fd5107e66b0d0f1ef8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d
3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.923649 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:49:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.935292 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.946616 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.960866 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37de03666ab8602a2a9d90c21788caee65748a0b3bdb4a81569c5bd05458aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:49:01Z\\\",\\\"message\\\":\\\"2025-09-29T09:48:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f2aa4435-037a-4e86-86a6-e636459566a3\\\\n2025-09-29T09:48:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f2aa4435-037a-4e86-86a6-e636459566a3 to /host/opt/cni/bin/\\\\n2025-09-29T09:48:16Z [verbose] multus-daemon started\\\\n2025-09-29T09:48:16Z [verbose] 
Readiness Indicator file check\\\\n2025-09-29T09:49:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.977564 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8
e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.988701 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.988746 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.988754 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.988773 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:02 crc kubenswrapper[4891]: 
I0929 09:49:02.988796 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:02Z","lastTransitionTime":"2025-09-29T09:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:02 crc kubenswrapper[4891]: I0929 09:49:02.992610 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.003286 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:03Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.013142 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724eeb028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74
c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:03Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.023434 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6thmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6thmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:03Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:03 crc 
kubenswrapper[4891]: I0929 09:49:03.036079 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:03Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.048161 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:03Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.063189 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:03Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.073806 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:03Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.091764 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.091971 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.092145 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.092296 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.092384 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:03Z","lastTransitionTime":"2025-09-29T09:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.195060 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.195104 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.195115 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.195135 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.195151 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:03Z","lastTransitionTime":"2025-09-29T09:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.297938 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.298285 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.298366 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.298454 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.298532 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:03Z","lastTransitionTime":"2025-09-29T09:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.394989 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:03 crc kubenswrapper[4891]: E0929 09:49:03.395202 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.395232 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.395393 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:03 crc kubenswrapper[4891]: E0929 09:49:03.395453 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.395024 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:03 crc kubenswrapper[4891]: E0929 09:49:03.395581 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:03 crc kubenswrapper[4891]: E0929 09:49:03.395669 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.400615 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.400658 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.400667 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.400684 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.400694 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:03Z","lastTransitionTime":"2025-09-29T09:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.503345 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.503386 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.503396 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.503413 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.503426 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:03Z","lastTransitionTime":"2025-09-29T09:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.606364 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.606861 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.606957 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.607054 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.607125 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:03Z","lastTransitionTime":"2025-09-29T09:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.709949 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.710010 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.710024 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.710047 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.710062 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:03Z","lastTransitionTime":"2025-09-29T09:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.812137 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.812186 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.812202 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.812226 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.812241 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:03Z","lastTransitionTime":"2025-09-29T09:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.926459 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.926519 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.926540 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.926563 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:03 crc kubenswrapper[4891]: I0929 09:49:03.926577 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:03Z","lastTransitionTime":"2025-09-29T09:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.029438 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.029491 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.029500 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.029520 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.029532 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:04Z","lastTransitionTime":"2025-09-29T09:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.132379 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.132675 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.132741 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.132821 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.132898 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:04Z","lastTransitionTime":"2025-09-29T09:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.235935 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.236546 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.236771 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.236919 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.237004 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:04Z","lastTransitionTime":"2025-09-29T09:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.339721 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.340072 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.340210 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.340321 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.340441 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:04Z","lastTransitionTime":"2025-09-29T09:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.444193 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.444673 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.444766 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.444884 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.444974 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:04Z","lastTransitionTime":"2025-09-29T09:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.547111 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.547206 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.547223 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.547244 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.547256 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:04Z","lastTransitionTime":"2025-09-29T09:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.649433 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.649775 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.649916 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.650023 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.650140 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:04Z","lastTransitionTime":"2025-09-29T09:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.753877 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.754369 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.754519 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.754674 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.754851 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:04Z","lastTransitionTime":"2025-09-29T09:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.857814 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.857962 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.857985 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.858012 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.858026 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:04Z","lastTransitionTime":"2025-09-29T09:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.961505 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.961579 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.961602 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.961631 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:04 crc kubenswrapper[4891]: I0929 09:49:04.961653 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:04Z","lastTransitionTime":"2025-09-29T09:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.064816 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.064886 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.064902 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.064926 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.064939 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:05Z","lastTransitionTime":"2025-09-29T09:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.167367 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.167672 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.167750 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.167905 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.167982 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:05Z","lastTransitionTime":"2025-09-29T09:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.270604 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.271063 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.271171 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.271298 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.271360 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:05Z","lastTransitionTime":"2025-09-29T09:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.374382 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.374446 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.374458 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.374480 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.374494 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:05Z","lastTransitionTime":"2025-09-29T09:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.395759 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:05 crc kubenswrapper[4891]: E0929 09:49:05.395967 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.396206 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:05 crc kubenswrapper[4891]: E0929 09:49:05.396278 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.396429 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:05 crc kubenswrapper[4891]: E0929 09:49:05.396500 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.396644 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:05 crc kubenswrapper[4891]: E0929 09:49:05.396712 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.477049 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.477103 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.477116 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.477132 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.477142 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:05Z","lastTransitionTime":"2025-09-29T09:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.579729 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.579806 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.579819 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.579840 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.579854 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:05Z","lastTransitionTime":"2025-09-29T09:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.683597 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.683649 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.683658 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.683677 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.683693 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:05Z","lastTransitionTime":"2025-09-29T09:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.786421 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.786482 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.786495 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.786519 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.786533 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:05Z","lastTransitionTime":"2025-09-29T09:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.888996 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.889047 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.889060 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.889080 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.889094 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:05Z","lastTransitionTime":"2025-09-29T09:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.991298 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.991384 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.991413 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.991445 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:05 crc kubenswrapper[4891]: I0929 09:49:05.991466 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:05Z","lastTransitionTime":"2025-09-29T09:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.094415 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.094490 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.094508 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.094534 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.094548 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:06Z","lastTransitionTime":"2025-09-29T09:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.198814 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.198866 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.198879 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.198904 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.198917 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:06Z","lastTransitionTime":"2025-09-29T09:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.302086 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.302152 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.302164 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.302184 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.302197 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:06Z","lastTransitionTime":"2025-09-29T09:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.403832 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.403886 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.403904 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.403925 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.403943 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:06Z","lastTransitionTime":"2025-09-29T09:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.508073 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.508594 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.508824 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.509054 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.509235 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:06Z","lastTransitionTime":"2025-09-29T09:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.611677 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.611710 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.611721 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.611736 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.611745 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:06Z","lastTransitionTime":"2025-09-29T09:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.714750 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.715082 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.715187 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.715307 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.715404 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:06Z","lastTransitionTime":"2025-09-29T09:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.817561 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.817915 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.818011 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.818111 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.818198 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:06Z","lastTransitionTime":"2025-09-29T09:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.921299 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.921357 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.921373 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.921393 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:06 crc kubenswrapper[4891]: I0929 09:49:06.921410 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:06Z","lastTransitionTime":"2025-09-29T09:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.024209 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.024289 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.024305 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.024332 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.024385 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:07Z","lastTransitionTime":"2025-09-29T09:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.127961 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.128028 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.128043 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.128068 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.128087 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:07Z","lastTransitionTime":"2025-09-29T09:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.231344 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.231426 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.231444 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.231483 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.231504 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:07Z","lastTransitionTime":"2025-09-29T09:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.334467 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.334519 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.334534 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.334558 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.334573 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:07Z","lastTransitionTime":"2025-09-29T09:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.395698 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.395733 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.395738 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.395758 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:07 crc kubenswrapper[4891]: E0929 09:49:07.395865 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:07 crc kubenswrapper[4891]: E0929 09:49:07.395999 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:07 crc kubenswrapper[4891]: E0929 09:49:07.396059 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:07 crc kubenswrapper[4891]: E0929 09:49:07.396111 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.438238 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.438287 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.438299 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.438321 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.438333 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:07Z","lastTransitionTime":"2025-09-29T09:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.541556 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.541605 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.541620 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.541652 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.541665 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:07Z","lastTransitionTime":"2025-09-29T09:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.644773 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.644833 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.644846 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.644866 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.644879 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:07Z","lastTransitionTime":"2025-09-29T09:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.747680 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.747741 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.747755 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.747779 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.747810 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:07Z","lastTransitionTime":"2025-09-29T09:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.851394 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.851446 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.851460 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.851480 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.851496 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:07Z","lastTransitionTime":"2025-09-29T09:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.955337 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.955391 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.955399 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.955416 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:07 crc kubenswrapper[4891]: I0929 09:49:07.955430 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:07Z","lastTransitionTime":"2025-09-29T09:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.058238 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.058298 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.058309 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.058341 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.058355 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:08Z","lastTransitionTime":"2025-09-29T09:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.162122 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.162241 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.162256 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.162296 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.162335 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:08Z","lastTransitionTime":"2025-09-29T09:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.266297 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.266371 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.266413 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.266443 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.266459 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:08Z","lastTransitionTime":"2025-09-29T09:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.369290 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.369349 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.369367 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.369395 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.369413 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:08Z","lastTransitionTime":"2025-09-29T09:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.472314 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.472379 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.472405 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.472442 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.472466 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:08Z","lastTransitionTime":"2025-09-29T09:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.575522 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.575562 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.575572 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.575591 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.575603 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:08Z","lastTransitionTime":"2025-09-29T09:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.678119 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.678196 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.678215 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.678251 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.678272 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:08Z","lastTransitionTime":"2025-09-29T09:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.780816 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.780887 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.780913 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.780937 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.780952 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:08Z","lastTransitionTime":"2025-09-29T09:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.883844 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.883893 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.883903 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.883921 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.883933 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:08Z","lastTransitionTime":"2025-09-29T09:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.986724 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.986770 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.986783 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.986829 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:08 crc kubenswrapper[4891]: I0929 09:49:08.986842 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:08Z","lastTransitionTime":"2025-09-29T09:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.089716 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.089761 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.089771 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.089798 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.089808 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:09Z","lastTransitionTime":"2025-09-29T09:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.192139 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.192178 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.192186 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.192201 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.192210 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:09Z","lastTransitionTime":"2025-09-29T09:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.265052 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.265127 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.265140 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.265161 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.265174 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:09Z","lastTransitionTime":"2025-09-29T09:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:09 crc kubenswrapper[4891]: E0929 09:49:09.283750 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:09Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.288500 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.288539 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.288570 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.288592 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.288606 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:09Z","lastTransitionTime":"2025-09-29T09:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:09 crc kubenswrapper[4891]: E0929 09:49:09.300537 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:09Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.306109 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.306138 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.306149 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.306168 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.306180 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:09Z","lastTransitionTime":"2025-09-29T09:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:09 crc kubenswrapper[4891]: E0929 09:49:09.320654 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:09Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.325169 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.325200 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.325211 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.325229 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.325241 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:09Z","lastTransitionTime":"2025-09-29T09:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:09 crc kubenswrapper[4891]: E0929 09:49:09.342641 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:09Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.346574 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.346629 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.346645 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.346668 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.346683 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:09Z","lastTransitionTime":"2025-09-29T09:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:09 crc kubenswrapper[4891]: E0929 09:49:09.362083 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:09Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:09 crc kubenswrapper[4891]: E0929 09:49:09.362210 4891 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.364169 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.364229 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.364244 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.364260 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.364269 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:09Z","lastTransitionTime":"2025-09-29T09:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.394939 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:09 crc kubenswrapper[4891]: E0929 09:49:09.395078 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.395253 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:09 crc kubenswrapper[4891]: E0929 09:49:09.395309 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.395431 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:09 crc kubenswrapper[4891]: E0929 09:49:09.395490 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.395589 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:09 crc kubenswrapper[4891]: E0929 09:49:09.395629 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.467401 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.467463 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.467476 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.467496 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.467511 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:09Z","lastTransitionTime":"2025-09-29T09:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.571269 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.571356 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.571375 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.571410 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.571429 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:09Z","lastTransitionTime":"2025-09-29T09:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.674580 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.674629 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.674642 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.674663 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.674676 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:09Z","lastTransitionTime":"2025-09-29T09:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.777582 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.777639 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.777655 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.777675 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.777690 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:09Z","lastTransitionTime":"2025-09-29T09:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.881231 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.881306 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.881325 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.881357 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.881378 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:09Z","lastTransitionTime":"2025-09-29T09:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.984926 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.985007 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.985027 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.985059 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:09 crc kubenswrapper[4891]: I0929 09:49:09.985077 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:09Z","lastTransitionTime":"2025-09-29T09:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.088469 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.088554 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.088573 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.088603 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.088624 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:10Z","lastTransitionTime":"2025-09-29T09:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.192710 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.192770 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.192828 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.192867 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.192886 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:10Z","lastTransitionTime":"2025-09-29T09:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.296869 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.296935 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.296951 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.296982 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.297001 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:10Z","lastTransitionTime":"2025-09-29T09:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.395714 4891 scope.go:117] "RemoveContainer" containerID="57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.398726 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.398773 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.398786 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.398824 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.398837 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:10Z","lastTransitionTime":"2025-09-29T09:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.414242 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://668c9c1c59bbc122cd5be3626616f5eedc5451c880f2b1fd5107e66b0d0f1ef8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.429774 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:49:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.443296 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.455669 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.475628 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:48:40Z\\\",\\\"message\\\":\\\"er for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling 
webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z]\\\\nI0929 09:48:40.329821 6539 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-diagnostics/network-check-target]} name:Service_openshift-network-diagnostics/network-check-target_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 09:48:40.329605 6539 model_client.go:382] Update o\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fs6qf_openshift-ovn-kubernetes(01bb1c54-d2f0-498e-ad60-8216c29b843d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce
7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.489562 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb72d414-a523-4f65-b189-e7128b35c535\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2215d70654d4558acf393bc3f75c191cb130c14b1a705de6f7ef040d792afa90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://352c9f8910374f43aa116300526704ebe076299397ecd20b86be658d53f38593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9a72f3a53e9bfc74e1f0bb793af51187cb6f11787a54af3e775c5a271b8b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016dda820498b6f4f30aecbd0eded36505e3fb2a366f19a1ebd0a77eabc1b82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://016dda820498b6f4f30aecbd0eded36505e3fb2a366f19a1ebd0a77eabc1b82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.502518 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.502547 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.502555 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.502571 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.502581 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:10Z","lastTransitionTime":"2025-09-29T09:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.507669 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37de03666ab8602a2a9d90c21788caee65748a0b3bdb4a81569c5bd05458aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:49:01Z\\\",\\\"message\\\":\\\"2025-09-29T09:48:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f2aa4435-037a-4e86-86a6-e636459566a3\\\\n2025-09-29T09:48:15+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f2aa4435-037a-4e86-86a6-e636459566a3 to /host/opt/cni/bin/\\\\n2025-09-29T09:48:16Z [verbose] multus-daemon started\\\\n2025-09-29T09:48:16Z [verbose] Readiness Indicator file check\\\\n2025-09-29T09:49:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.527445 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88364
724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.546548 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.570975 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.589630 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.605729 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.605782 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.605821 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.605845 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.605859 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:10Z","lastTransitionTime":"2025-09-29T09:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.606555 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b
4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.629734 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.644514 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.657368 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.684498 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724eeb028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.698245 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6thmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6thmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:10 crc 
kubenswrapper[4891]: I0929 09:49:10.707962 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.708010 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.708020 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.708038 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.708050 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:10Z","lastTransitionTime":"2025-09-29T09:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.810491 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.810557 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.810575 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.810604 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.810623 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:10Z","lastTransitionTime":"2025-09-29T09:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.902829 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fs6qf_01bb1c54-d2f0-498e-ad60-8216c29b843d/ovnkube-controller/2.log" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.906081 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerStarted","Data":"7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef"} Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.906647 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.912635 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.912684 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.912696 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.912717 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.912730 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:10Z","lastTransitionTime":"2025-09-29T09:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.939331 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.964655 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.978290 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:10 crc kubenswrapper[4891]: I0929 09:49:10.993852 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.005753 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724eeb028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.015701 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.015754 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.015765 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.015807 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.015845 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:11Z","lastTransitionTime":"2025-09-29T09:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.022143 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6thmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6thmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:11 crc 
kubenswrapper[4891]: I0929 09:49:11.037730 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.053740 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.068368 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:49:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.085734 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.098731 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.118323 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.118359 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.118369 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.118386 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.118405 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:11Z","lastTransitionTime":"2025-09-29T09:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.128846 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:48:40Z\\\",\\\"message\\\":\\\"er for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling 
webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z]\\\\nI0929 09:48:40.329821 6539 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-diagnostics/network-check-target]} name:Service_openshift-network-diagnostics/network-check-target_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 09:48:40.329605 6539 model_client.go:382] Update 
o\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.157204 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb72d414-a523-4f65-b189-e7128b35c535\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2215d70654d4558acf393bc3f75c191cb130c14b1a705de6f7ef040d792afa90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://352c9f8910374f43aa116300526704ebe076299397ecd20b86be658d53f38593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9a72f3a53e9bfc74e1f0bb793af51187cb6f11787a54af3e775c5a271b8b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016dda820498b6f4f30aecbd0eded36505e3fb2a366f19a1ebd0a77eabc1b82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://016dda820498b6f4f30aecbd0eded36505e3fb2a366f19a1ebd0a77eabc1b82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.177723 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://668c9c1c59bbc122cd5be3626616f5eedc5451c880f2b1fd5107e66b0d0f1ef8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d
3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.192439 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.210903 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37de03666ab8602a2a9d90c21788caee65748a0b3bdb4a81569c5bd05458aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:49:01Z\\\",\\\"message\\\":\\\"2025-09-29T09:48:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f2aa4435-037a-4e86-86a6-e636459566a3\\\\n2025-09-29T09:48:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f2aa4435-037a-4e86-86a6-e636459566a3 to /host/opt/cni/bin/\\\\n2025-09-29T09:48:16Z [verbose] multus-daemon started\\\\n2025-09-29T09:48:16Z [verbose] 
Readiness Indicator file check\\\\n2025-09-29T09:49:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.221225 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.221345 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.221358 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.221376 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.221392 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:11Z","lastTransitionTime":"2025-09-29T09:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.226336 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.324422 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.324478 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.324487 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.324507 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.324519 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:11Z","lastTransitionTime":"2025-09-29T09:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.395490 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.395557 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:11 crc kubenswrapper[4891]: E0929 09:49:11.395645 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.395507 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.395508 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:11 crc kubenswrapper[4891]: E0929 09:49:11.395712 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:11 crc kubenswrapper[4891]: E0929 09:49:11.395811 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:11 crc kubenswrapper[4891]: E0929 09:49:11.396008 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.427482 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.427536 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.427547 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.427568 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.427581 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:11Z","lastTransitionTime":"2025-09-29T09:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.530318 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.530362 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.530374 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.530392 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.530403 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:11Z","lastTransitionTime":"2025-09-29T09:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.632954 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.633016 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.633029 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.633050 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.633060 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:11Z","lastTransitionTime":"2025-09-29T09:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.736058 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.736101 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.736111 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.736129 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.736140 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:11Z","lastTransitionTime":"2025-09-29T09:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.839263 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.839310 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.839323 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.839342 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.839356 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:11Z","lastTransitionTime":"2025-09-29T09:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.911177 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fs6qf_01bb1c54-d2f0-498e-ad60-8216c29b843d/ovnkube-controller/3.log" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.912134 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fs6qf_01bb1c54-d2f0-498e-ad60-8216c29b843d/ovnkube-controller/2.log" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.914856 4891 generic.go:334] "Generic (PLEG): container finished" podID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerID="7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef" exitCode=1 Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.914910 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerDied","Data":"7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef"} Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.914953 4891 scope.go:117] "RemoveContainer" containerID="57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.915670 4891 scope.go:117] "RemoveContainer" containerID="7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef" Sep 29 09:49:11 crc kubenswrapper[4891]: E0929 09:49:11.915842 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fs6qf_openshift-ovn-kubernetes(01bb1c54-d2f0-498e-ad60-8216c29b843d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.939658 4891 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c0bdf138e4d3f03e6f6ff40e03c961d0bd63ff33b48b6ee8b749385e394c93\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:48:40Z\\\",\\\"message\\\":\\\"er for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling 
webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:48:40Z is after 2025-08-24T17:21:41Z]\\\\nI0929 09:48:40.329821 6539 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-diagnostics/network-check-target]} name:Service_openshift-network-diagnostics/network-check-target_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 09:48:40.329605 6539 model_client.go:382] Update o\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:49:11Z\\\",\\\"message\\\":\\\"se, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0929 09:49:11.349061 6921 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port 
Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 09:49:11.346150 6921 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nF0929 09:49:11.349189 6921 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 
0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc
2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.942305 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.942357 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.942371 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.942391 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.942402 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:11Z","lastTransitionTime":"2025-09-29T09:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.954068 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb72d414-a523-4f65-b189-e7128b35c535\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2215d70654d4558acf393bc3f75c191cb130c14b1a705de6f7ef040d792afa90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://352c9f8910374f43aa116300526704ebe076299397ecd20b86be658d53f38593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9a72f3a53e9bfc74e1f0bb793af51187cb6f11787a54af3e775c5a271b8b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016dda820498b6f4f30aecbd0eded36505e3fb2a366f19a1ebd0a77eabc1b82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://016dda820498b6f4f30aecbd0eded36505e3fb2a366f19a1ebd0a77eabc1b82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.974066 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://668c9c1c59bbc122cd5be3626616f5eedc5451c880f2b1fd5107e66b0d0f1ef8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d
3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:11 crc kubenswrapper[4891]: I0929 09:49:11.987673 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:49:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.000744 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.011499 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.026831 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.042048 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37de03666ab8602a2a9d90c21788caee65748a0b3bdb4a81569c5bd05458aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:49:01Z\\\",\\\"message\\\":\\\"2025-09-29T09:48:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f2aa4435-037a-4e86-86a6-e636459566a3\\\\n2025-09-29T09:48:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f2aa4435-037a-4e86-86a6-e636459566a3 to /host/opt/cni/bin/\\\\n2025-09-29T09:48:16Z [verbose] multus-daemon started\\\\n2025-09-29T09:48:16Z [verbose] 
Readiness Indicator file check\\\\n2025-09-29T09:49:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.045846 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.045893 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.045909 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.045929 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.045948 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:12Z","lastTransitionTime":"2025-09-29T09:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.057355 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.072903 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-
29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.091208 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.106758 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6thmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6thmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:12 crc 
kubenswrapper[4891]: I0929 09:49:12.121661 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.137143 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.148955 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.149343 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.149416 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.149496 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.149408 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.149579 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:12Z","lastTransitionTime":"2025-09-29T09:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.167721 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.233726 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724ee
b028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.253518 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.253564 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.253577 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.253594 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.253604 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:12Z","lastTransitionTime":"2025-09-29T09:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.355953 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.355998 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.356013 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.356035 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.356049 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:12Z","lastTransitionTime":"2025-09-29T09:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.458863 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.458947 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.458974 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.459012 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.459035 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:12Z","lastTransitionTime":"2025-09-29T09:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.561683 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.561729 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.561738 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.561754 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.561764 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:12Z","lastTransitionTime":"2025-09-29T09:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.664920 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.665176 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.665185 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.665199 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.665210 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:12Z","lastTransitionTime":"2025-09-29T09:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.770470 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.770529 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.770542 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.770665 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.770693 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:12Z","lastTransitionTime":"2025-09-29T09:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.873394 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.873447 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.873460 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.873481 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.873493 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:12Z","lastTransitionTime":"2025-09-29T09:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.927033 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fs6qf_01bb1c54-d2f0-498e-ad60-8216c29b843d/ovnkube-controller/3.log" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.931595 4891 scope.go:117] "RemoveContainer" containerID="7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef" Sep 29 09:49:12 crc kubenswrapper[4891]: E0929 09:49:12.931811 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fs6qf_openshift-ovn-kubernetes(01bb1c54-d2f0-498e-ad60-8216c29b843d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.945709 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.957998 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37de03666ab8602a2a9d90c21788caee65748a0b3bdb4a81569c5bd05458aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:49:01Z\\\",\\\"message\\\":\\\"2025-09-29T09:48:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f2aa4435-037a-4e86-86a6-e636459566a3\\\\n2025-09-29T09:48:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f2aa4435-037a-4e86-86a6-e636459566a3 to /host/opt/cni/bin/\\\\n2025-09-29T09:48:16Z [verbose] multus-daemon started\\\\n2025-09-29T09:48:16Z [verbose] 
Readiness Indicator file check\\\\n2025-09-29T09:49:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.975673 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8
e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.976536 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.976585 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.976596 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.976615 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:12 crc kubenswrapper[4891]: 
I0929 09:49:12.976624 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:12Z","lastTransitionTime":"2025-09-29T09:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:12 crc kubenswrapper[4891]: I0929 09:49:12.989634 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.002949 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:13Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.015827 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:13Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.032747 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:13Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.049646 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:13Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.062403 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:13Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.078672 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.078705 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.078714 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.078730 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.078739 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:13Z","lastTransitionTime":"2025-09-29T09:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.088336 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724eeb028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:13Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.099331 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6thmw" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6thmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:13Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:13 crc 
kubenswrapper[4891]: I0929 09:49:13.176131 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb72d414-a523-4f65-b189-e7128b35c535\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2215d70654d4558acf393bc3f75c191cb130c14b1a705de6f7ef040d792afa90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://352c9f8910374f43aa116300526704ebe076299397ecd20b86be658d53f38593\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9a72f3a53e9bfc74e1f0bb793af51187cb6f11787a54af3e775c5a271b8b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016dda820498b6f4f30aecbd0eded36505e3fb2a366f19a1ebd0a77eabc1b82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://016dda820498b6f4f30aecbd0eded36505e3fb2a366f19a1ebd0a77eabc1b82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:13Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.181640 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.181693 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.181706 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.181732 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.181743 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:13Z","lastTransitionTime":"2025-09-29T09:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.196524 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://668c9c1c59bbc122cd5be3626616f5eedc5451c880f2b1fd5107e66b0d0f1ef8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b
8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:13Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.211470 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:49:13Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.225080 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:13Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.268810 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:13Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.283966 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.284006 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.284017 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.284033 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.284043 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:13Z","lastTransitionTime":"2025-09-29T09:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.287328 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:49:11Z\\\",\\\"message\\\":\\\"se, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0929 
09:49:11.349061 6921 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 09:49:11.346150 6921 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nF0929 09:49:11.349189 6921 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:49:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fs6qf_openshift-ovn-kubernetes(01bb1c54-d2f0-498e-ad60-8216c29b843d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce
7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:13Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.386694 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.386748 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.386760 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.386778 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.386804 4891 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:13Z","lastTransitionTime":"2025-09-29T09:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.394983 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:13 crc kubenswrapper[4891]: E0929 09:49:13.395315 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.395114 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:13 crc kubenswrapper[4891]: E0929 09:49:13.395735 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.395057 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:13 crc kubenswrapper[4891]: E0929 09:49:13.397045 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.395140 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:13 crc kubenswrapper[4891]: E0929 09:49:13.397282 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.488637 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.488675 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.488686 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.488704 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.488717 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:13Z","lastTransitionTime":"2025-09-29T09:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.591642 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.591688 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.591699 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.591717 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.591726 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:13Z","lastTransitionTime":"2025-09-29T09:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.694338 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.694401 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.694417 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.694444 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.694463 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:13Z","lastTransitionTime":"2025-09-29T09:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.797332 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.797385 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.797394 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.797412 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.797421 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:13Z","lastTransitionTime":"2025-09-29T09:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.900150 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.900188 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.900199 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.900213 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:13 crc kubenswrapper[4891]: I0929 09:49:13.900224 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:13Z","lastTransitionTime":"2025-09-29T09:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.002451 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.002518 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.002541 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.002569 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.002586 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:14Z","lastTransitionTime":"2025-09-29T09:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.104636 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.104677 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.104692 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.104712 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.104726 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:14Z","lastTransitionTime":"2025-09-29T09:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.206615 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.206655 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.206667 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.206686 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.206697 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:14Z","lastTransitionTime":"2025-09-29T09:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.312347 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.312403 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.312416 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.312435 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.312451 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:14Z","lastTransitionTime":"2025-09-29T09:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.414778 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.414839 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.414852 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.414871 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.414882 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:14Z","lastTransitionTime":"2025-09-29T09:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.516670 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.516723 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.516735 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.516754 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.516766 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:14Z","lastTransitionTime":"2025-09-29T09:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.618974 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.619015 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.619025 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.619042 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.619145 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:14Z","lastTransitionTime":"2025-09-29T09:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.721907 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.721958 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.721970 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.721988 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.722001 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:14Z","lastTransitionTime":"2025-09-29T09:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.824548 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.824590 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.824616 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.824638 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.824653 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:14Z","lastTransitionTime":"2025-09-29T09:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.927343 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.927663 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.927747 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.927876 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:14 crc kubenswrapper[4891]: I0929 09:49:14.927968 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:14Z","lastTransitionTime":"2025-09-29T09:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.030603 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.030644 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.030655 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.030674 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.030686 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:15Z","lastTransitionTime":"2025-09-29T09:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.133827 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.133866 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.133878 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.133896 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.133907 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:15Z","lastTransitionTime":"2025-09-29T09:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.196422 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:49:15 crc kubenswrapper[4891]: E0929 09:49:15.196548 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-29 09:50:19.19652212 +0000 UTC m=+149.401690451 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.236450 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.236496 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.236506 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.236524 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.236534 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:15Z","lastTransitionTime":"2025-09-29T09:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.298027 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.299381 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:15 crc kubenswrapper[4891]: E0929 09:49:15.298253 4891 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.300345 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:15 crc kubenswrapper[4891]: E0929 09:49:15.300385 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-09-29 09:50:19.300361924 +0000 UTC m=+149.505530245 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.300503 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:15 crc kubenswrapper[4891]: E0929 09:49:15.300044 4891 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:49:15 crc kubenswrapper[4891]: E0929 09:49:15.300579 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 09:50:19.30057212 +0000 UTC m=+149.505740441 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:49:15 crc kubenswrapper[4891]: E0929 09:49:15.300736 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:49:15 crc kubenswrapper[4891]: E0929 09:49:15.300758 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:49:15 crc kubenswrapper[4891]: E0929 09:49:15.300771 4891 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:49:15 crc kubenswrapper[4891]: E0929 09:49:15.300809 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 09:50:19.300802747 +0000 UTC m=+149.505971068 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:49:15 crc kubenswrapper[4891]: E0929 09:49:15.300854 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:49:15 crc kubenswrapper[4891]: E0929 09:49:15.300886 4891 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:49:15 crc kubenswrapper[4891]: E0929 09:49:15.300912 4891 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:49:15 crc kubenswrapper[4891]: E0929 09:49:15.301016 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 09:50:19.300985612 +0000 UTC m=+149.506153963 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.339524 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.339718 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.339744 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.339768 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.339786 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:15Z","lastTransitionTime":"2025-09-29T09:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.395421 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:15 crc kubenswrapper[4891]: E0929 09:49:15.395690 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.396096 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.396242 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.396624 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:15 crc kubenswrapper[4891]: E0929 09:49:15.396840 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:15 crc kubenswrapper[4891]: E0929 09:49:15.396958 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:15 crc kubenswrapper[4891]: E0929 09:49:15.397311 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.418158 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.442527 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.442582 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.442598 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.442623 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.442646 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:15Z","lastTransitionTime":"2025-09-29T09:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.545922 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.545992 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.546035 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.546072 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.546098 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:15Z","lastTransitionTime":"2025-09-29T09:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.648932 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.648997 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.649018 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.649046 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.649068 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:15Z","lastTransitionTime":"2025-09-29T09:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.752755 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.752892 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.752913 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.752949 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.752971 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:15Z","lastTransitionTime":"2025-09-29T09:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.856423 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.856463 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.856474 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.856489 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.856499 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:15Z","lastTransitionTime":"2025-09-29T09:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.959482 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.959544 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.959560 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.959589 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:15 crc kubenswrapper[4891]: I0929 09:49:15.959617 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:15Z","lastTransitionTime":"2025-09-29T09:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.062241 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.062280 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.062288 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.062303 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.062313 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:16Z","lastTransitionTime":"2025-09-29T09:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.164821 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.164866 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.164875 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.164896 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.164906 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:16Z","lastTransitionTime":"2025-09-29T09:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.267249 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.267337 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.267348 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.267363 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.267374 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:16Z","lastTransitionTime":"2025-09-29T09:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.369992 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.370052 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.370066 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.370088 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.370101 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:16Z","lastTransitionTime":"2025-09-29T09:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.472600 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.472641 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.472652 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.472669 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.472678 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:16Z","lastTransitionTime":"2025-09-29T09:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.575772 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.575844 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.575856 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.575874 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.575888 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:16Z","lastTransitionTime":"2025-09-29T09:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.678308 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.678354 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.678365 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.678385 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.678397 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:16Z","lastTransitionTime":"2025-09-29T09:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.785245 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.785394 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.785421 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.785449 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.785463 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:16Z","lastTransitionTime":"2025-09-29T09:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.889186 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.889236 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.889246 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.889265 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.889279 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:16Z","lastTransitionTime":"2025-09-29T09:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.991858 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.991908 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.991928 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.991962 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:16 crc kubenswrapper[4891]: I0929 09:49:16.991974 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:16Z","lastTransitionTime":"2025-09-29T09:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.094583 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.094636 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.094648 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.094667 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.094678 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:17Z","lastTransitionTime":"2025-09-29T09:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.201099 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.201141 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.201151 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.201169 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.201178 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:17Z","lastTransitionTime":"2025-09-29T09:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.303966 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.304000 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.304008 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.304022 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.304031 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:17Z","lastTransitionTime":"2025-09-29T09:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.395204 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.395249 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.395257 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.395254 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:17 crc kubenswrapper[4891]: E0929 09:49:17.395345 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:17 crc kubenswrapper[4891]: E0929 09:49:17.395502 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:17 crc kubenswrapper[4891]: E0929 09:49:17.395636 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:17 crc kubenswrapper[4891]: E0929 09:49:17.395764 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.406093 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.406128 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.406136 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.406151 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.406161 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:17Z","lastTransitionTime":"2025-09-29T09:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.508413 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.508448 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.508456 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.508469 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.508477 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:17Z","lastTransitionTime":"2025-09-29T09:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.610711 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.610745 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.610754 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.610767 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.610783 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:17Z","lastTransitionTime":"2025-09-29T09:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.713656 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.713718 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.713732 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.713748 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.713785 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:17Z","lastTransitionTime":"2025-09-29T09:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.816711 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.816781 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.816827 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.816854 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.816870 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:17Z","lastTransitionTime":"2025-09-29T09:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.920441 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.920484 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.920492 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.920508 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:17 crc kubenswrapper[4891]: I0929 09:49:17.920520 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:17Z","lastTransitionTime":"2025-09-29T09:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.022910 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.022980 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.023001 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.023029 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.023045 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:18Z","lastTransitionTime":"2025-09-29T09:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.126360 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.126429 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.126443 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.126465 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.126483 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:18Z","lastTransitionTime":"2025-09-29T09:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.228651 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.228692 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.228702 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.228718 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.228729 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:18Z","lastTransitionTime":"2025-09-29T09:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.331561 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.331627 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.331638 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.331662 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.331675 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:18Z","lastTransitionTime":"2025-09-29T09:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.434094 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.434137 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.434147 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.434162 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.434171 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:18Z","lastTransitionTime":"2025-09-29T09:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.537405 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.537468 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.537488 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.537517 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.537539 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:18Z","lastTransitionTime":"2025-09-29T09:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.640150 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.640186 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.640196 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.640212 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.640222 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:18Z","lastTransitionTime":"2025-09-29T09:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.743624 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.743681 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.743692 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.743716 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.743729 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:18Z","lastTransitionTime":"2025-09-29T09:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.846376 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.846415 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.846425 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.846440 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.846451 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:18Z","lastTransitionTime":"2025-09-29T09:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.950893 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.950943 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.950958 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.950977 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:18 crc kubenswrapper[4891]: I0929 09:49:18.950990 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:18Z","lastTransitionTime":"2025-09-29T09:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.053716 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.053773 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.053815 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.053835 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.053847 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:19Z","lastTransitionTime":"2025-09-29T09:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.157099 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.157141 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.157150 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.157166 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.157176 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:19Z","lastTransitionTime":"2025-09-29T09:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.260518 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.260572 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.260583 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.260605 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.260618 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:19Z","lastTransitionTime":"2025-09-29T09:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.364215 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.364295 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.364310 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.364340 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.364385 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:19Z","lastTransitionTime":"2025-09-29T09:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.395487 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.395537 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.395615 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:19 crc kubenswrapper[4891]: E0929 09:49:19.395652 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.395537 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:19 crc kubenswrapper[4891]: E0929 09:49:19.395755 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:19 crc kubenswrapper[4891]: E0929 09:49:19.395905 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:19 crc kubenswrapper[4891]: E0929 09:49:19.395998 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.424091 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.424167 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.424189 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.424219 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.424237 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:19Z","lastTransitionTime":"2025-09-29T09:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:19 crc kubenswrapper[4891]: E0929 09:49:19.440476 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.445821 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.445927 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.445946 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.445965 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.445980 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:19Z","lastTransitionTime":"2025-09-29T09:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:19 crc kubenswrapper[4891]: E0929 09:49:19.459434 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.464597 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.464657 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.464675 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.464702 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.464735 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:19Z","lastTransitionTime":"2025-09-29T09:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:19 crc kubenswrapper[4891]: E0929 09:49:19.479590 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.484573 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.484634 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.484662 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.484692 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.484710 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:19Z","lastTransitionTime":"2025-09-29T09:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:19 crc kubenswrapper[4891]: E0929 09:49:19.498510 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.502324 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.502367 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.502375 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.502390 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.502401 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:19Z","lastTransitionTime":"2025-09-29T09:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:19 crc kubenswrapper[4891]: E0929 09:49:19.514845 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:19Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:19 crc kubenswrapper[4891]: E0929 09:49:19.514991 4891 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.516518 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.516557 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.516572 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.516591 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.516606 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:19Z","lastTransitionTime":"2025-09-29T09:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.619482 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.619561 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.619573 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.619593 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.619605 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:19Z","lastTransitionTime":"2025-09-29T09:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.722098 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.722142 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.722150 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.722165 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.722175 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:19Z","lastTransitionTime":"2025-09-29T09:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.824574 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.824626 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.824638 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.824657 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.824672 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:19Z","lastTransitionTime":"2025-09-29T09:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.927684 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.927757 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.927770 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.927804 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:19 crc kubenswrapper[4891]: I0929 09:49:19.927818 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:19Z","lastTransitionTime":"2025-09-29T09:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.029999 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.030043 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.030055 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.030107 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.030121 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:20Z","lastTransitionTime":"2025-09-29T09:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.134555 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.134629 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.134640 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.134662 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.134672 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:20Z","lastTransitionTime":"2025-09-29T09:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.236899 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.236934 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.236943 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.236956 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.236965 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:20Z","lastTransitionTime":"2025-09-29T09:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.339289 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.339355 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.339365 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.339378 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.339407 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:20Z","lastTransitionTime":"2025-09-29T09:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.414265 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.437385 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.442778 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.442879 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.442895 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.442909 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.442919 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:20Z","lastTransitionTime":"2025-09-29T09:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.454089 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b
4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.469374 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.481453 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.493432 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.504905 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724eeb028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.517859 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6thmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6thmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:20 crc 
kubenswrapper[4891]: I0929 09:49:20.528831 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82deea0a-506f-44cd-9018-b52635615dd2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7ce3227865c2ececfa056500f90c320210ff247b8c173d45efdc901216b4968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://54b45bdc346d2afb241021854cd164cf3af6e743dcbca474c14118a35dfaf630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54b45bdc346d2afb241021854cd164cf3af6e743dcbca474c14118a35dfaf630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.541218 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://668c9c1c59bbc122cd5be3626616f5eedc5451c880f2b1fd5107e66b0d0f1ef8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d
3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.545273 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.545338 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.545374 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.545396 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.545408 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:20Z","lastTransitionTime":"2025-09-29T09:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.556250 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.571040 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.591142 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.610554 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:49:11Z\\\",\\\"message\\\":\\\"se, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0929 
09:49:11.349061 6921 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 09:49:11.346150 6921 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nF0929 09:49:11.349189 6921 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:49:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fs6qf_openshift-ovn-kubernetes(01bb1c54-d2f0-498e-ad60-8216c29b843d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce
7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.624692 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb72d414-a523-4f65-b189-e7128b35c535\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2215d70654d4558acf393bc3f75c191cb130c14b1a705de6f7ef040d792afa90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://352c9f8910374f43aa116300526704ebe076299397ecd20b86be658d53f38593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9a72f3a53e9bfc74e1f0bb793af51187cb6f11787a54af3e775c5a271b8b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016dda820498b6f4f30aecbd0eded36505e3fb2a366f19a1ebd0a77eabc1b82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://016dda820498b6f4f30aecbd0eded36505e3fb2a366f19a1ebd0a77eabc1b82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.637404 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37de03666ab
8602a2a9d90c21788caee65748a0b3bdb4a81569c5bd05458aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:49:01Z\\\",\\\"message\\\":\\\"2025-09-29T09:48:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f2aa4435-037a-4e86-86a6-e636459566a3\\\\n2025-09-29T09:48:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f2aa4435-037a-4e86-86a6-e636459566a3 to /host/opt/cni/bin/\\\\n2025-09-29T09:48:16Z [verbose] multus-daemon started\\\\n2025-09-29T09:48:16Z [verbose] Readiness Indicator file check\\\\n2025-09-29T09:49:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.647874 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.647914 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.647928 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.647945 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.647958 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:20Z","lastTransitionTime":"2025-09-29T09:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.650632 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.662756 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:20Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.750476 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.750878 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.750890 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.750909 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.751109 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:20Z","lastTransitionTime":"2025-09-29T09:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.853904 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.853966 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.853980 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.853995 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.854006 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:20Z","lastTransitionTime":"2025-09-29T09:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.955910 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.955951 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.955963 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.955982 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:20 crc kubenswrapper[4891]: I0929 09:49:20.955994 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:20Z","lastTransitionTime":"2025-09-29T09:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.059246 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.059286 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.059297 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.059315 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.059324 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:21Z","lastTransitionTime":"2025-09-29T09:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.162576 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.162657 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.162671 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.162693 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.162708 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:21Z","lastTransitionTime":"2025-09-29T09:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.265353 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.265415 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.265430 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.265449 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.265462 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:21Z","lastTransitionTime":"2025-09-29T09:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.369037 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.369096 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.369107 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.369131 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.369145 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:21Z","lastTransitionTime":"2025-09-29T09:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.395568 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.395635 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.395568 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:21 crc kubenswrapper[4891]: E0929 09:49:21.395815 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:21 crc kubenswrapper[4891]: E0929 09:49:21.395949 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.396031 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:21 crc kubenswrapper[4891]: E0929 09:49:21.396113 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:21 crc kubenswrapper[4891]: E0929 09:49:21.396198 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.471565 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.471608 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.471621 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.471643 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.471658 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:21Z","lastTransitionTime":"2025-09-29T09:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.577344 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.577395 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.577404 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.577424 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.577435 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:21Z","lastTransitionTime":"2025-09-29T09:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.680003 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.680072 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.680086 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.680116 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.680158 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:21Z","lastTransitionTime":"2025-09-29T09:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.782863 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.782908 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.782917 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.782935 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.782948 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:21Z","lastTransitionTime":"2025-09-29T09:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.884906 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.884972 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.884990 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.885021 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.885041 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:21Z","lastTransitionTime":"2025-09-29T09:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.988205 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.988285 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.988298 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.988318 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:21 crc kubenswrapper[4891]: I0929 09:49:21.988332 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:21Z","lastTransitionTime":"2025-09-29T09:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.091879 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.091943 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.091956 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.091980 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.091993 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:22Z","lastTransitionTime":"2025-09-29T09:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.195282 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.195695 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.195759 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.195867 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.195946 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:22Z","lastTransitionTime":"2025-09-29T09:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.299125 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.299181 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.299193 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.299206 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.299214 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:22Z","lastTransitionTime":"2025-09-29T09:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.406465 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.406540 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.406554 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.406576 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.406591 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:22Z","lastTransitionTime":"2025-09-29T09:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.509266 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.509326 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.509346 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.509369 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.509385 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:22Z","lastTransitionTime":"2025-09-29T09:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.614108 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.614167 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.614179 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.614197 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.614206 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:22Z","lastTransitionTime":"2025-09-29T09:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.717015 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.717066 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.717078 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.717097 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.717110 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:22Z","lastTransitionTime":"2025-09-29T09:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.819910 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.819959 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.819970 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.819987 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.820000 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:22Z","lastTransitionTime":"2025-09-29T09:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.922443 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.922487 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.922497 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.922511 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:22 crc kubenswrapper[4891]: I0929 09:49:22.922520 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:22Z","lastTransitionTime":"2025-09-29T09:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.024820 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.024876 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.024891 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.024909 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.024922 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:23Z","lastTransitionTime":"2025-09-29T09:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.127409 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.127445 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.127455 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.127470 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.127480 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:23Z","lastTransitionTime":"2025-09-29T09:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.230149 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.230195 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.230211 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.230232 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.230252 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:23Z","lastTransitionTime":"2025-09-29T09:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.333051 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.333098 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.333111 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.333129 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.333140 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:23Z","lastTransitionTime":"2025-09-29T09:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.395691 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.395779 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.395827 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:23 crc kubenswrapper[4891]: E0929 09:49:23.395851 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.395779 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:23 crc kubenswrapper[4891]: E0929 09:49:23.395927 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:23 crc kubenswrapper[4891]: E0929 09:49:23.395991 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:23 crc kubenswrapper[4891]: E0929 09:49:23.396056 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.435965 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.436008 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.436018 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.436031 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.436040 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:23Z","lastTransitionTime":"2025-09-29T09:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.538183 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.538220 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.538230 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.538245 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.538255 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:23Z","lastTransitionTime":"2025-09-29T09:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.641102 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.641131 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.641140 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.641152 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.641161 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:23Z","lastTransitionTime":"2025-09-29T09:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.744479 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.744529 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.744541 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.744563 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.744576 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:23Z","lastTransitionTime":"2025-09-29T09:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.848529 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.848590 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.848600 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.848630 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.848645 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:23Z","lastTransitionTime":"2025-09-29T09:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.951355 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.951426 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.951439 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.951464 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:23 crc kubenswrapper[4891]: I0929 09:49:23.951479 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:23Z","lastTransitionTime":"2025-09-29T09:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.054556 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.054609 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.054621 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.054639 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.054654 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:24Z","lastTransitionTime":"2025-09-29T09:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.156829 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.156874 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.156887 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.156905 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.156915 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:24Z","lastTransitionTime":"2025-09-29T09:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.259341 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.259394 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.259411 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.259426 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.259436 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:24Z","lastTransitionTime":"2025-09-29T09:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.361539 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.361602 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.361610 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.361628 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.361639 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:24Z","lastTransitionTime":"2025-09-29T09:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.464699 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.464737 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.464746 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.464762 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.464771 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:24Z","lastTransitionTime":"2025-09-29T09:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.567728 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.567769 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.567777 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.567822 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.567838 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:24Z","lastTransitionTime":"2025-09-29T09:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.671029 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.671063 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.671071 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.671086 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.671096 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:24Z","lastTransitionTime":"2025-09-29T09:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.773248 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.773277 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.773285 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.773298 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.773306 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:24Z","lastTransitionTime":"2025-09-29T09:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.876119 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.876157 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.876165 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.876179 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.876188 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:24Z","lastTransitionTime":"2025-09-29T09:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.978378 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.978424 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.978438 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.978457 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:24 crc kubenswrapper[4891]: I0929 09:49:24.978468 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:24Z","lastTransitionTime":"2025-09-29T09:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.080403 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.080433 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.080442 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.080457 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.080469 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:25Z","lastTransitionTime":"2025-09-29T09:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.183460 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.183490 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.183498 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.183511 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.183523 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:25Z","lastTransitionTime":"2025-09-29T09:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.286581 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.286629 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.286640 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.286653 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.286663 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:25Z","lastTransitionTime":"2025-09-29T09:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.388956 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.389011 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.389027 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.389044 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.389055 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:25Z","lastTransitionTime":"2025-09-29T09:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.395397 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.395479 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:25 crc kubenswrapper[4891]: E0929 09:49:25.395526 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.395415 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:25 crc kubenswrapper[4891]: E0929 09:49:25.395598 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.395415 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:25 crc kubenswrapper[4891]: E0929 09:49:25.395706 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:25 crc kubenswrapper[4891]: E0929 09:49:25.395836 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.492120 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.492178 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.492198 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.492224 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.492243 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:25Z","lastTransitionTime":"2025-09-29T09:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.595897 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.595963 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.595975 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.595995 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.596011 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:25Z","lastTransitionTime":"2025-09-29T09:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.699385 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.699446 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.699459 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.699483 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.699496 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:25Z","lastTransitionTime":"2025-09-29T09:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.803029 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.803086 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.803102 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.803316 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.803336 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:25Z","lastTransitionTime":"2025-09-29T09:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.905745 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.905836 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.905859 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.905883 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:25 crc kubenswrapper[4891]: I0929 09:49:25.905900 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:25Z","lastTransitionTime":"2025-09-29T09:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.008550 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.008605 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.008616 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.008635 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.008646 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:26Z","lastTransitionTime":"2025-09-29T09:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.111112 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.111166 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.111177 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.111198 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.111211 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:26Z","lastTransitionTime":"2025-09-29T09:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.214060 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.214117 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.214129 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.214147 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.214158 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:26Z","lastTransitionTime":"2025-09-29T09:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.316877 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.316944 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.316956 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.316978 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.316988 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:26Z","lastTransitionTime":"2025-09-29T09:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.396484 4891 scope.go:117] "RemoveContainer" containerID="7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef" Sep 29 09:49:26 crc kubenswrapper[4891]: E0929 09:49:26.396660 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fs6qf_openshift-ovn-kubernetes(01bb1c54-d2f0-498e-ad60-8216c29b843d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.419940 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.420004 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.420073 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.420108 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.420131 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:26Z","lastTransitionTime":"2025-09-29T09:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.523940 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.524034 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.524062 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.524098 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.524128 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:26Z","lastTransitionTime":"2025-09-29T09:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.627022 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.627086 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.627099 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.627126 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.627144 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:26Z","lastTransitionTime":"2025-09-29T09:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.730054 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.730132 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.730155 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.730185 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.730203 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:26Z","lastTransitionTime":"2025-09-29T09:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.832896 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.832934 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.832943 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.832959 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.832968 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:26Z","lastTransitionTime":"2025-09-29T09:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.935063 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.935136 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.935158 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.935177 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:26 crc kubenswrapper[4891]: I0929 09:49:26.935193 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:26Z","lastTransitionTime":"2025-09-29T09:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.038371 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.038424 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.038442 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.038462 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.038474 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:27Z","lastTransitionTime":"2025-09-29T09:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.141236 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.141273 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.141286 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.141304 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.141316 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:27Z","lastTransitionTime":"2025-09-29T09:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.283264 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.283315 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.283329 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.283347 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.283359 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:27Z","lastTransitionTime":"2025-09-29T09:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.387171 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.387238 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.387252 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.387279 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.387293 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:27Z","lastTransitionTime":"2025-09-29T09:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.395637 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.395737 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:27 crc kubenswrapper[4891]: E0929 09:49:27.395896 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.395937 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.395915 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:27 crc kubenswrapper[4891]: E0929 09:49:27.396139 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:27 crc kubenswrapper[4891]: E0929 09:49:27.396265 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:27 crc kubenswrapper[4891]: E0929 09:49:27.396313 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.490210 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.490294 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.490306 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.490325 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.490337 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:27Z","lastTransitionTime":"2025-09-29T09:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.593355 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.593425 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.593443 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.593470 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.593487 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:27Z","lastTransitionTime":"2025-09-29T09:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.696665 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.696715 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.696727 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.696744 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.696755 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:27Z","lastTransitionTime":"2025-09-29T09:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.799356 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.799415 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.799431 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.799454 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.799478 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:27Z","lastTransitionTime":"2025-09-29T09:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.902284 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.902320 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.902328 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.902341 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:27 crc kubenswrapper[4891]: I0929 09:49:27.902350 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:27Z","lastTransitionTime":"2025-09-29T09:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.005396 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.005446 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.005455 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.005473 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.005486 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:28Z","lastTransitionTime":"2025-09-29T09:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.108141 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.108181 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.108194 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.108212 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.108223 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:28Z","lastTransitionTime":"2025-09-29T09:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.210608 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.210651 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.210662 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.210679 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.210691 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:28Z","lastTransitionTime":"2025-09-29T09:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.313193 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.313235 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.313245 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.313261 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.313274 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:28Z","lastTransitionTime":"2025-09-29T09:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.415628 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.415670 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.415678 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.415694 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.415705 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:28Z","lastTransitionTime":"2025-09-29T09:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.518166 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.518202 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.518212 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.518227 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.518238 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:28Z","lastTransitionTime":"2025-09-29T09:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.621133 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.621181 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.621197 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.621269 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.621284 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:28Z","lastTransitionTime":"2025-09-29T09:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.723844 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.723893 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.723905 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.723925 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.723939 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:28Z","lastTransitionTime":"2025-09-29T09:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.826608 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.826654 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.826666 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.826685 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.826696 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:28Z","lastTransitionTime":"2025-09-29T09:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.929287 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.929342 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.929350 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.929365 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:28 crc kubenswrapper[4891]: I0929 09:49:28.929377 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:28Z","lastTransitionTime":"2025-09-29T09:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.032300 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.032352 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.032371 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.032393 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.032407 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:29Z","lastTransitionTime":"2025-09-29T09:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.135326 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.135392 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.135402 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.135418 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.135429 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:29Z","lastTransitionTime":"2025-09-29T09:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.238664 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.238735 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.238750 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.238769 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.238780 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:29Z","lastTransitionTime":"2025-09-29T09:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.340744 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.340783 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.340827 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.340844 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.340855 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:29Z","lastTransitionTime":"2025-09-29T09:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.395471 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.395571 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:29 crc kubenswrapper[4891]: E0929 09:49:29.395709 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.395603 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:29 crc kubenswrapper[4891]: E0929 09:49:29.395841 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:29 crc kubenswrapper[4891]: E0929 09:49:29.395902 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.396034 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:29 crc kubenswrapper[4891]: E0929 09:49:29.396213 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.442823 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.442859 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.442875 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.442896 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.442908 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:29Z","lastTransitionTime":"2025-09-29T09:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.546397 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.546436 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.546446 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.546464 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.546474 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:29Z","lastTransitionTime":"2025-09-29T09:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.576150 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.576196 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.576212 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.576230 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.576246 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:29Z","lastTransitionTime":"2025-09-29T09:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:29 crc kubenswrapper[4891]: E0929 09:49:29.587406 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:29Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.592143 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.592185 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.592198 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.592218 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.592230 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:29Z","lastTransitionTime":"2025-09-29T09:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:29 crc kubenswrapper[4891]: E0929 09:49:29.608690 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:29Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.614581 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.614640 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.614652 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.614668 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.614681 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:29Z","lastTransitionTime":"2025-09-29T09:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:29 crc kubenswrapper[4891]: E0929 09:49:29.629644 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:29Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.635243 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.635298 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.635308 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.635334 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.635352 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:29Z","lastTransitionTime":"2025-09-29T09:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.654642 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.654691 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.654704 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.654726 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.654746 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:29Z","lastTransitionTime":"2025-09-29T09:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:29Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:29 crc kubenswrapper[4891]: E0929 09:49:29.674336 4891 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.676080 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.676118 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.676132 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.676150 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.676164 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:29Z","lastTransitionTime":"2025-09-29T09:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.779265 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.779304 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.779317 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.779335 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.779348 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:29Z","lastTransitionTime":"2025-09-29T09:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.883049 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.883092 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.883102 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.883117 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.883126 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:29Z","lastTransitionTime":"2025-09-29T09:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.952248 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs\") pod \"network-metrics-daemon-6thmw\" (UID: \"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\") " pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:29 crc kubenswrapper[4891]: E0929 09:49:29.952491 4891 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:49:29 crc kubenswrapper[4891]: E0929 09:49:29.952645 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs podName:45417d1e-e3f1-4cc9-9f51-65affc9d72f6 nodeName:}" failed. No retries permitted until 2025-09-29 09:50:33.952603141 +0000 UTC m=+164.157771642 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs") pod "network-metrics-daemon-6thmw" (UID: "45417d1e-e3f1-4cc9-9f51-65affc9d72f6") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.984995 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.985055 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.985067 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.985088 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:29 crc kubenswrapper[4891]: I0929 09:49:29.985103 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:29Z","lastTransitionTime":"2025-09-29T09:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.088069 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.088102 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.088110 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.088125 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.088136 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:30Z","lastTransitionTime":"2025-09-29T09:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.190559 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.190613 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.190626 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.190653 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.190669 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:30Z","lastTransitionTime":"2025-09-29T09:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.293514 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.293575 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.293588 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.293609 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.293620 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:30Z","lastTransitionTime":"2025-09-29T09:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.396856 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.396922 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.396952 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.396970 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.396989 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:30Z","lastTransitionTime":"2025-09-29T09:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.410321 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6953354a1c0ec9bc61fff4ab310a9e85450bd5e0eb3322157e12733d5d927546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae59705039932ed3d7bc6349e4bb6c567c7f36936eca925fefde844cb821e2\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.422404 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582de198-5a15-4c4c-aaea-881c638a42ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d79b856ecedfb7d47b4a9f8b2e3259d71cd19ffd836431b05ce81d14b6947309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91afd0d56169c1f360c57ceb97957bc48e79615d
ed802e7f78b8bcb6939d55b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk4j9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gb8tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.435119 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5nmcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143386d4-de10-4bb9-b79e-eaf04f8247ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bd5abe1d743a08e5cfb8625e1e0f160e531c7ca7c12c4df51b1cb1a13676c21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzs9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5nmcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.447823 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724eeb028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.457864 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6thmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f84hq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6thmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:30 crc 
kubenswrapper[4891]: I0929 09:49:30.466190 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82deea0a-506f-44cd-9018-b52635615dd2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7ce3227865c2ececfa056500f90c320210ff247b8c173d45efdc901216b4968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://54b45bdc346d2afb241021854cd164cf3af6e743dcbca474c14118a35dfaf630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54b45bdc346d2afb241021854cd164cf3af6e743dcbca474c14118a35dfaf630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.478228 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28fe23bc-3be7-483a-96ca-3bdabaf4c1bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35951596d12f87275b2231ea4f7f60041b488fe704609343961907d14b7601a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e39a5fb9c9b4ad498041f6e2c951079a51c5980aba28c48e344fb88da3bd068a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b4fa02fb948cecd171c7f6bc4aad8192106926514db2511c797b39322a97d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c701f4cfe17a0935ec9461400f4dcbe5a6ebbc4debd5bb701111d67d2f814009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.492182 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5170b609-4265-4e55-aafc-340289385106\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://841f5dc66fb1c85371cabf43884c6257b16df22fcc077987c3e00ca2c62791f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9816c522b52f355f446b433b68e6fbf39f682b0ffc178f8e81141acb1eb679b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af81ee19648c41e951a7f30f0c73f18c70099c56ea7e1074b05b6cd0e6cd9b0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://668c9c1c59bbc122cd5be3626616f5eedc5451c880f2b1fd5107e66b0d0f1ef8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afae0aefbaf256efdc9b5b60c2dbf130e447817f15c07af2ed5fd5cd7274fe75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:48:11Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:48:06.244701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:48:06.245650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3646544380/tls.crt::/tmp/serving-cert-3646544380/tls.key\\\\\\\"\\\\nI0929 09:48:11.481977 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:48:11.503782 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:48:11.503832 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:48:11.504092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:48:11.504106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:48:11.530654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:48:11.530688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530694 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:48:11.530698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:48:11.530701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:48:11.530705 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:48:11.530708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:48:11.531606 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0929 09:48:11.536842 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32102327ea5ae296567dc1e6dfd969d0c7e89115ace440253e87dbee87f5088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8595636a8ea6cfbb3762c513de966de2d
3d23d8efa7f5e4e8f4d91d4fe8bc7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.499814 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.499861 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.499872 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.499888 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.499900 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:30Z","lastTransitionTime":"2025-09-29T09:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.505572 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40abd6643455ee919eb5c25efd2dd4126b42c7435bcb5a4161ef86ed2bb49ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.519404 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.531774 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lfjwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4ba2043-c805-45e4-8a8c-aff311ac3ea5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83c801b7cecf7adbdcb0454830dd00fe07edcf1f22e4b4bc1d52a1d49c15802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5jxw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lfjwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.552070 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01bb1c54-d2f0-498e-ad60-8216c29b843d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:49:11Z\\\",\\\"message\\\":\\\"se, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0929 
09:49:11.349061 6921 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 09:49:11.346150 6921 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nF0929 09:49:11.349189 6921 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:49:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fs6qf_openshift-ovn-kubernetes(01bb1c54-d2f0-498e-ad60-8216c29b843d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a60b2fb13d3bdf2fce
7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jbx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fs6qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.566035 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb72d414-a523-4f65-b189-e7128b35c535\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2215d70654d4558acf393bc3f75c191cb130c14b1a705de6f7ef040d792afa90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://352c9f8910374f43aa116300526704ebe076299397ecd20b86be658d53f38593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9a72f3a53e9bfc74e1f0bb793af51187cb6f11787a54af3e775c5a271b8b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://016dda820498b6f4f30aecbd0eded36505e3fb2a366f19a1ebd0a77eabc1b82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://016dda820498b6f4f30aecbd0eded36505e3fb2a366f19a1ebd0a77eabc1b82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:47:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:47:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.582230 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88364
724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.597022 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.602032 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.602078 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.602089 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.602104 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.602114 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:30Z","lastTransitionTime":"2025-09-29T09:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.610045 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37de03666ab8602a2a9d90c21788caee65748a0b3bdb4a81569c5bd05458aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:49:01Z\\\",\\\"message\\\":\\\"2025-09-29T09:48:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ 
to /host/opt/cni/bin/upgrade_f2aa4435-037a-4e86-86a6-e636459566a3\\\\n2025-09-29T09:48:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f2aa4435-037a-4e86-86a6-e636459566a3 to /host/opt/cni/bin/\\\\n2025-09-29T09:48:16Z [verbose] multus-daemon started\\\\n2025-09-29T09:48:16Z [verbose] Readiness Indicator file check\\\\n2025-09-29T09:49:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/mult
us.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.623235 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.636192 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:30Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.704386 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.704431 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.704444 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.704460 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.704472 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:30Z","lastTransitionTime":"2025-09-29T09:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.808477 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.809582 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.809727 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.809983 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.810251 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:30Z","lastTransitionTime":"2025-09-29T09:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.914589 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.915549 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.915713 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.915881 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:30 crc kubenswrapper[4891]: I0929 09:49:30.915965 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:30Z","lastTransitionTime":"2025-09-29T09:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.019820 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.020495 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.020641 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.020846 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.021017 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:31Z","lastTransitionTime":"2025-09-29T09:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.125257 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.125378 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.125408 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.125440 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.125463 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:31Z","lastTransitionTime":"2025-09-29T09:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.229095 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.229169 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.229184 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.229205 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.229220 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:31Z","lastTransitionTime":"2025-09-29T09:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.332470 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.332510 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.332520 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.332560 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.332573 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:31Z","lastTransitionTime":"2025-09-29T09:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.395442 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.395537 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.395482 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.395457 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:31 crc kubenswrapper[4891]: E0929 09:49:31.395888 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:31 crc kubenswrapper[4891]: E0929 09:49:31.396021 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:31 crc kubenswrapper[4891]: E0929 09:49:31.396102 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:31 crc kubenswrapper[4891]: E0929 09:49:31.396152 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.436069 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.436126 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.436142 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.436172 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.436199 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:31Z","lastTransitionTime":"2025-09-29T09:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.539622 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.539694 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.539707 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.539727 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.539741 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:31Z","lastTransitionTime":"2025-09-29T09:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.643364 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.643413 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.643424 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.643441 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.643452 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:31Z","lastTransitionTime":"2025-09-29T09:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.745983 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.746231 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.746245 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.746261 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.746272 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:31Z","lastTransitionTime":"2025-09-29T09:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.848594 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.848645 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.848654 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.848667 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.848677 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:31Z","lastTransitionTime":"2025-09-29T09:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.951153 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.951197 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.951213 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.951229 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:31 crc kubenswrapper[4891]: I0929 09:49:31.951239 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:31Z","lastTransitionTime":"2025-09-29T09:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.054290 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.054341 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.054355 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.054372 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.054420 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:32Z","lastTransitionTime":"2025-09-29T09:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.156733 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.156771 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.156783 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.156836 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.156848 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:32Z","lastTransitionTime":"2025-09-29T09:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.294713 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.294757 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.294769 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.294801 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.294814 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:32Z","lastTransitionTime":"2025-09-29T09:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.397125 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.397160 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.397170 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.397186 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.397194 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:32Z","lastTransitionTime":"2025-09-29T09:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.503345 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.503403 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.503415 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.503435 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.503450 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:32Z","lastTransitionTime":"2025-09-29T09:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.606203 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.606249 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.606259 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.606276 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.606291 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:32Z","lastTransitionTime":"2025-09-29T09:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.708997 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.709054 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.709066 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.709083 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.709093 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:32Z","lastTransitionTime":"2025-09-29T09:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.812385 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.812422 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.812435 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.812449 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.812459 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:32Z","lastTransitionTime":"2025-09-29T09:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.915250 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.915301 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.915319 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.915350 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:32 crc kubenswrapper[4891]: I0929 09:49:32.915367 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:32Z","lastTransitionTime":"2025-09-29T09:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.018159 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.018207 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.018219 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.018240 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.018255 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:33Z","lastTransitionTime":"2025-09-29T09:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.121337 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.121396 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.121409 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.121431 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.121445 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:33Z","lastTransitionTime":"2025-09-29T09:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.224372 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.224451 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.224469 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.224494 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.224512 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:33Z","lastTransitionTime":"2025-09-29T09:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.326908 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.326959 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.326972 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.326991 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.327006 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:33Z","lastTransitionTime":"2025-09-29T09:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.395742 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.395872 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.395962 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.395900 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:33 crc kubenswrapper[4891]: E0929 09:49:33.396157 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:33 crc kubenswrapper[4891]: E0929 09:49:33.396372 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:33 crc kubenswrapper[4891]: E0929 09:49:33.396727 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:33 crc kubenswrapper[4891]: E0929 09:49:33.396810 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.429219 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.429273 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.429285 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.429309 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.429320 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:33Z","lastTransitionTime":"2025-09-29T09:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.532671 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.532737 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.532749 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.532768 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.532779 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:33Z","lastTransitionTime":"2025-09-29T09:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.636360 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.636413 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.636423 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.636446 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.636461 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:33Z","lastTransitionTime":"2025-09-29T09:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.739427 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.739474 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.739485 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.739501 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.739512 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:33Z","lastTransitionTime":"2025-09-29T09:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.841971 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.842043 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.842054 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.842081 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.842100 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:33Z","lastTransitionTime":"2025-09-29T09:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.944758 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.944855 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.944870 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.944890 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:33 crc kubenswrapper[4891]: I0929 09:49:33.944902 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:33Z","lastTransitionTime":"2025-09-29T09:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.047684 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.047742 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.047759 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.047785 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.047839 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:34Z","lastTransitionTime":"2025-09-29T09:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.151229 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.151315 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.151332 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.151404 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.151426 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:34Z","lastTransitionTime":"2025-09-29T09:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.253968 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.254009 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.254019 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.254033 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.254045 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:34Z","lastTransitionTime":"2025-09-29T09:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.356372 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.356420 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.356429 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.356445 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.356455 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:34Z","lastTransitionTime":"2025-09-29T09:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.459258 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.459300 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.459308 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.459325 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.459337 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:34Z","lastTransitionTime":"2025-09-29T09:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.561456 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.561513 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.561522 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.561541 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.561550 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:34Z","lastTransitionTime":"2025-09-29T09:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.664657 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.664715 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.664725 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.664738 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.664748 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:34Z","lastTransitionTime":"2025-09-29T09:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.767514 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.767570 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.767581 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.767599 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.767612 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:34Z","lastTransitionTime":"2025-09-29T09:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.870475 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.870526 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.870536 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.870549 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.870565 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:34Z","lastTransitionTime":"2025-09-29T09:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.972678 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.972718 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.972728 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.972742 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:34 crc kubenswrapper[4891]: I0929 09:49:34.972752 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:34Z","lastTransitionTime":"2025-09-29T09:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.076175 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.076264 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.076279 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.076295 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.076305 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:35Z","lastTransitionTime":"2025-09-29T09:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.178551 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.178594 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.178605 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.178621 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.178632 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:35Z","lastTransitionTime":"2025-09-29T09:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.281633 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.281669 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.281678 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.281693 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.281702 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:35Z","lastTransitionTime":"2025-09-29T09:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.384293 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.384369 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.384393 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.384423 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.384448 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:35Z","lastTransitionTime":"2025-09-29T09:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.395534 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.395574 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.395543 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.395548 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw"
Sep 29 09:49:35 crc kubenswrapper[4891]: E0929 09:49:35.395724 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 29 09:49:35 crc kubenswrapper[4891]: E0929 09:49:35.395672 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 29 09:49:35 crc kubenswrapper[4891]: E0929 09:49:35.395868 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 29 09:49:35 crc kubenswrapper[4891]: E0929 09:49:35.395925 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6"
Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.487434 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.487492 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.487503 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.487521 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.487533 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:35Z","lastTransitionTime":"2025-09-29T09:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.590260 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.590304 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.590314 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.590330 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.590342 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:35Z","lastTransitionTime":"2025-09-29T09:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.693174 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.693216 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.693227 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.693244 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.693257 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:35Z","lastTransitionTime":"2025-09-29T09:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.795522 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.795572 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.795590 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.795611 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.795624 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:35Z","lastTransitionTime":"2025-09-29T09:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.899173 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.899260 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.899296 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.899318 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:35 crc kubenswrapper[4891]: I0929 09:49:35.899329 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:35Z","lastTransitionTime":"2025-09-29T09:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.002180 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.002236 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.002249 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.002273 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.002287 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:36Z","lastTransitionTime":"2025-09-29T09:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.104239 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.104301 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.104318 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.104342 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.104361 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:36Z","lastTransitionTime":"2025-09-29T09:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.208408 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.208517 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.208529 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.208549 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.208560 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:36Z","lastTransitionTime":"2025-09-29T09:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.311422 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.311477 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.311486 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.311505 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.311516 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:36Z","lastTransitionTime":"2025-09-29T09:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.415346 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.415396 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.415405 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.415424 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.415434 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:36Z","lastTransitionTime":"2025-09-29T09:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.518465 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.518540 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.518554 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.518570 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.518582 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:36Z","lastTransitionTime":"2025-09-29T09:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.621275 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.621320 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.621335 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.621353 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.621369 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:36Z","lastTransitionTime":"2025-09-29T09:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.723971 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.724011 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.724020 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.724035 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.724045 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:36Z","lastTransitionTime":"2025-09-29T09:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.826157 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.826233 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.826242 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.826256 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.826266 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:36Z","lastTransitionTime":"2025-09-29T09:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.928418 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.928464 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.928477 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.928494 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:36 crc kubenswrapper[4891]: I0929 09:49:36.928506 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:36Z","lastTransitionTime":"2025-09-29T09:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.031429 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.031480 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.031490 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.031508 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.031519 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:37Z","lastTransitionTime":"2025-09-29T09:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.134339 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.134389 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.134404 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.134427 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.134445 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:37Z","lastTransitionTime":"2025-09-29T09:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.237432 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.237492 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.237504 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.237527 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.237541 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:37Z","lastTransitionTime":"2025-09-29T09:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.340563 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.340600 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.340617 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.340631 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.340644 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:37Z","lastTransitionTime":"2025-09-29T09:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.395724 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.395782 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw"
Sep 29 09:49:37 crc kubenswrapper[4891]: E0929 09:49:37.395866 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 29 09:49:37 crc kubenswrapper[4891]: E0929 09:49:37.396012 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6"
Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.396097 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.396277 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 09:49:37 crc kubenswrapper[4891]: E0929 09:49:37.396709 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 29 09:49:37 crc kubenswrapper[4891]: E0929 09:49:37.397360 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.398044 4891 scope.go:117] "RemoveContainer" containerID="7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef"
Sep 29 09:49:37 crc kubenswrapper[4891]: E0929 09:49:37.398267 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fs6qf_openshift-ovn-kubernetes(01bb1c54-d2f0-498e-ad60-8216c29b843d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d"
Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.412259 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"]
Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.443425 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.443484 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.443494 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.443512 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.443523 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:37Z","lastTransitionTime":"2025-09-29T09:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file 
in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.546467 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.546536 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.546550 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.546570 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.546584 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:37Z","lastTransitionTime":"2025-09-29T09:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.649835 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.649878 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.649887 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.649905 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.649915 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:37Z","lastTransitionTime":"2025-09-29T09:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.753110 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.753202 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.753219 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.753242 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.753253 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:37Z","lastTransitionTime":"2025-09-29T09:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.856483 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.856569 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.856583 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.856616 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.856633 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:37Z","lastTransitionTime":"2025-09-29T09:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.959878 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.959938 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.959948 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.959965 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:37 crc kubenswrapper[4891]: I0929 09:49:37.959974 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:37Z","lastTransitionTime":"2025-09-29T09:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.062498 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.062532 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.062540 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.062556 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.062569 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:38Z","lastTransitionTime":"2025-09-29T09:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.164825 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.164914 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.164929 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.164946 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.164958 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:38Z","lastTransitionTime":"2025-09-29T09:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.267092 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.267119 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.267127 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.267142 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.267150 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:38Z","lastTransitionTime":"2025-09-29T09:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.370153 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.370222 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.370235 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.370258 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.370276 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:38Z","lastTransitionTime":"2025-09-29T09:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.473631 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.473687 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.473701 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.473721 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.473736 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:38Z","lastTransitionTime":"2025-09-29T09:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.576366 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.576412 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.576431 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.576451 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.576467 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:38Z","lastTransitionTime":"2025-09-29T09:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.678980 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.679023 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.679042 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.679061 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.679076 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:38Z","lastTransitionTime":"2025-09-29T09:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.782097 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.782133 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.782142 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.782159 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.782167 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:38Z","lastTransitionTime":"2025-09-29T09:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.885100 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.885152 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.885162 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.885180 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.885191 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:38Z","lastTransitionTime":"2025-09-29T09:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.988230 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.988281 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.988292 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.988313 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:38 crc kubenswrapper[4891]: I0929 09:49:38.988329 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:38Z","lastTransitionTime":"2025-09-29T09:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.090579 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.090624 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.090636 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.090653 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.090668 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:39Z","lastTransitionTime":"2025-09-29T09:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.194488 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.194580 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.194605 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.194644 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.194670 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:39Z","lastTransitionTime":"2025-09-29T09:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.297777 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.297884 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.297899 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.297914 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.297924 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:39Z","lastTransitionTime":"2025-09-29T09:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.395455 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.395509 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:39 crc kubenswrapper[4891]: E0929 09:49:39.395624 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.395455 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.395752 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:39 crc kubenswrapper[4891]: E0929 09:49:39.395834 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:39 crc kubenswrapper[4891]: E0929 09:49:39.395905 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:39 crc kubenswrapper[4891]: E0929 09:49:39.396215 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.400449 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.400516 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.400530 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.400556 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.400566 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:39Z","lastTransitionTime":"2025-09-29T09:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.503682 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.503756 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.503773 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.503831 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.503849 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:39Z","lastTransitionTime":"2025-09-29T09:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.607290 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.607367 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.607384 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.607410 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.607424 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:39Z","lastTransitionTime":"2025-09-29T09:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.710183 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.710238 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.710248 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.710270 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.710283 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:39Z","lastTransitionTime":"2025-09-29T09:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.813226 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.813271 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.813281 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.813304 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.813315 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:39Z","lastTransitionTime":"2025-09-29T09:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.916807 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.916861 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.916872 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.916899 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:39 crc kubenswrapper[4891]: I0929 09:49:39.916914 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:39Z","lastTransitionTime":"2025-09-29T09:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.000430 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.000521 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.000545 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.000579 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.000602 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:40Z","lastTransitionTime":"2025-09-29T09:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:40 crc kubenswrapper[4891]: E0929 09:49:40.016616 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.022386 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.022454 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.022468 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.022488 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.022503 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:40Z","lastTransitionTime":"2025-09-29T09:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:40 crc kubenswrapper[4891]: E0929 09:49:40.039912 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.043892 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.043928 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.043937 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.043958 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.043969 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:40Z","lastTransitionTime":"2025-09-29T09:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:40 crc kubenswrapper[4891]: E0929 09:49:40.058784 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.063610 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.063654 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.063663 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.063680 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.063691 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:40Z","lastTransitionTime":"2025-09-29T09:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:40 crc kubenswrapper[4891]: E0929 09:49:40.076747 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.081304 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.081343 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.081356 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.081376 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.081390 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:40Z","lastTransitionTime":"2025-09-29T09:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:40 crc kubenswrapper[4891]: E0929 09:49:40.095771 4891 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37d954c2-3d94-47a1-be48-2d150f56c63a\\\",\\\"systemUUID\\\":\\\"4df1a8e1-540a-4a1c-bb3d-0ff769533bc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:40 crc kubenswrapper[4891]: E0929 09:49:40.095977 4891 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.098005 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.098069 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.098083 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.098111 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.098126 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:40Z","lastTransitionTime":"2025-09-29T09:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.202295 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.202364 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.202381 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.202405 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.202421 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:40Z","lastTransitionTime":"2025-09-29T09:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.306092 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.306147 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.306159 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.306184 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.306197 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:40Z","lastTransitionTime":"2025-09-29T09:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.408372 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.408420 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.408432 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.408451 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.408464 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:40Z","lastTransitionTime":"2025-09-29T09:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.411170 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.427037 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ngmm4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bfce090-366c-43be-ab12-d291b4d25217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37de03666ab8602a2a9d90c21788caee65748a0b3bdb4a81569c5bd05458aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:49:01Z\\\",\\\"message\\\":\\\"2025-09-29T09:48:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f2aa4435-037a-4e86-86a6-e636459566a3\\\\n2025-09-29T09:48:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f2aa4435-037a-4e86-86a6-e636459566a3 to /host/opt/cni/bin/\\\\n2025-09-29T09:48:16Z [verbose] multus-daemon started\\\\n2025-09-29T09:48:16Z [verbose] 
Readiness Indicator file check\\\\n2025-09-29T09:49:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4994w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ngmm4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.446110 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d682f7f1-c1f5-46e1-827a-c9cdfeb82a3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8
e2d6cbf8094ad80a59ac77acecc3d3b453ff6e083be109d487323b5f5b086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99025c4e60270a6f01ff50353f030c413b100dbfd82048888be97035e4164558\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39266331b93041617c91e33d53e9a22ae134dbfd174be104359d6627b2e6bc13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d59992f3dad0292fc9e39f3e1b14a7902c9e4cdd2a735ad0b5f87d576e53184\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88364724629e418c948ea2900b1e804422f52dbce32a7459bd0beb499a42e4ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a85fae5aa423fc421682bb195dd065e9a08e33f932ff55ca1416ddcedc2ab69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ca0d82508bab459d3e33d6bbcef633bca990ecc69de0a9c66f07e901300753\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:48:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:48:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7wscb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fhhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.462537 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d96e9ca266d44b5dea51c7aef6ec4d0890c73a5650a3b9e07a0506ef932d613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.479759 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.492528 4891 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11edc92e-b224-4b6a-a4a8-4ccf9e696341\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05d4724eeb028b7c59a9f5513ffeff71868422b3d9bb94b00d4039aa38bcd44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3ca05cc8ef5370d7df062aa2c7d068a5ac74
c2431726ff01b95ab9b35400d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6k98\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:48:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vxv4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:49:40Z is after 2025-08-24T17:21:41Z" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.511179 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.511222 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.511234 4891 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.511253 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.511266 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:40Z","lastTransitionTime":"2025-09-29T09:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.553171 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=88.553145872 podStartE2EDuration="1m28.553145872s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:49:40.551000237 +0000 UTC m=+110.756168578" watchObservedRunningTime="2025-09-29 09:49:40.553145872 +0000 UTC m=+110.758314193" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.553400 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=25.553395259 podStartE2EDuration="25.553395259s" podCreationTimestamp="2025-09-29 09:49:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:49:40.533427049 +0000 UTC m=+110.738595390" watchObservedRunningTime="2025-09-29 09:49:40.553395259 +0000 UTC m=+110.758563580" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.595122 4891 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podStartSLOduration=89.590631369 podStartE2EDuration="1m29.590631369s" podCreationTimestamp="2025-09-29 09:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:49:40.588572137 +0000 UTC m=+110.793740468" watchObservedRunningTime="2025-09-29 09:49:40.590631369 +0000 UTC m=+110.795799720" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.614490 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.614545 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.614560 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.614580 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.614593 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:40Z","lastTransitionTime":"2025-09-29T09:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.620733 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5nmcv" podStartSLOduration=88.620710124 podStartE2EDuration="1m28.620710124s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:49:40.606844667 +0000 UTC m=+110.812013008" watchObservedRunningTime="2025-09-29 09:49:40.620710124 +0000 UTC m=+110.825878445" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.650674 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lfjwh" podStartSLOduration=89.650649705 podStartE2EDuration="1m29.650649705s" podCreationTimestamp="2025-09-29 09:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:49:40.621455347 +0000 UTC m=+110.826623668" watchObservedRunningTime="2025-09-29 09:49:40.650649705 +0000 UTC m=+110.855818026" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.696592 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=56.696567516 podStartE2EDuration="56.696567516s" podCreationTimestamp="2025-09-29 09:48:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:49:40.668316226 +0000 UTC m=+110.873484547" watchObservedRunningTime="2025-09-29 09:49:40.696567516 +0000 UTC m=+110.901735837" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.697002 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=3.696995049 podStartE2EDuration="3.696995049s" 
podCreationTimestamp="2025-09-29 09:49:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:49:40.696279587 +0000 UTC m=+110.901447928" watchObservedRunningTime="2025-09-29 09:49:40.696995049 +0000 UTC m=+110.902163380" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.715314 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=88.715287859 podStartE2EDuration="1m28.715287859s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:49:40.71465475 +0000 UTC m=+110.919823081" watchObservedRunningTime="2025-09-29 09:49:40.715287859 +0000 UTC m=+110.920456180" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.716960 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.717022 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.717036 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.717062 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.717076 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:40Z","lastTransitionTime":"2025-09-29T09:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.819734 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.819777 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.819803 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.819820 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.819832 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:40Z","lastTransitionTime":"2025-09-29T09:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.922031 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.922090 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.922104 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.922125 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:40 crc kubenswrapper[4891]: I0929 09:49:40.922142 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:40Z","lastTransitionTime":"2025-09-29T09:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.025429 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.025490 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.025501 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.025523 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.025538 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:41Z","lastTransitionTime":"2025-09-29T09:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.128599 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.128650 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.128662 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.128685 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.128698 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:41Z","lastTransitionTime":"2025-09-29T09:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.232368 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.232412 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.232425 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.232441 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.232453 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:41Z","lastTransitionTime":"2025-09-29T09:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.335671 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.335710 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.335719 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.335734 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.335744 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:41Z","lastTransitionTime":"2025-09-29T09:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.395432 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.395490 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.395495 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.395441 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:41 crc kubenswrapper[4891]: E0929 09:49:41.395586 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:41 crc kubenswrapper[4891]: E0929 09:49:41.395768 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:41 crc kubenswrapper[4891]: E0929 09:49:41.395822 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:41 crc kubenswrapper[4891]: E0929 09:49:41.395907 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.438384 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.438429 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.438439 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.438454 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.438462 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:41Z","lastTransitionTime":"2025-09-29T09:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.541528 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.541627 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.541650 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.541668 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.541681 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:41Z","lastTransitionTime":"2025-09-29T09:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.644861 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.644916 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.644927 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.644943 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.644958 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:41Z","lastTransitionTime":"2025-09-29T09:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.747736 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.748211 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.748319 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.748414 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.748513 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:41Z","lastTransitionTime":"2025-09-29T09:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.851112 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.851614 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.851707 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.851834 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.851915 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:41Z","lastTransitionTime":"2025-09-29T09:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.954464 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.954496 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.954506 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.954520 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:41 crc kubenswrapper[4891]: I0929 09:49:41.954533 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:41Z","lastTransitionTime":"2025-09-29T09:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.058768 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.058875 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.058889 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.058917 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.058932 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:42Z","lastTransitionTime":"2025-09-29T09:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.161752 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.161816 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.161834 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.161857 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.161873 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:42Z","lastTransitionTime":"2025-09-29T09:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.264190 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.264248 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.264258 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.264273 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.264284 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:42Z","lastTransitionTime":"2025-09-29T09:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.367338 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.367383 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.367394 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.367411 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.367422 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:42Z","lastTransitionTime":"2025-09-29T09:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.469519 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.470074 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.470197 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.470339 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.470447 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:42Z","lastTransitionTime":"2025-09-29T09:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.572835 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.572875 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.572885 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.572902 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.572911 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:42Z","lastTransitionTime":"2025-09-29T09:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.675282 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.675346 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.675357 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.675372 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.675382 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:42Z","lastTransitionTime":"2025-09-29T09:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.777291 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.777407 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.777422 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.777436 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.777447 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:42Z","lastTransitionTime":"2025-09-29T09:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.879459 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.879515 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.879527 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.879541 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.879551 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:42Z","lastTransitionTime":"2025-09-29T09:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.982374 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.982753 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.982850 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.982925 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:42 crc kubenswrapper[4891]: I0929 09:49:42.983045 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:42Z","lastTransitionTime":"2025-09-29T09:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.086264 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.087012 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.087070 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.087105 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.087119 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:43Z","lastTransitionTime":"2025-09-29T09:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.189617 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.189658 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.189668 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.189683 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.189692 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:43Z","lastTransitionTime":"2025-09-29T09:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.291982 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.292025 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.292036 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.292051 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.292060 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:43Z","lastTransitionTime":"2025-09-29T09:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.394831 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.394882 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:43 crc kubenswrapper[4891]: E0929 09:49:43.395030 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.395064 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.395073 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.395090 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.395120 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.395131 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.395147 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:43 crc kubenswrapper[4891]: E0929 09:49:43.395163 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.395160 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:43Z","lastTransitionTime":"2025-09-29T09:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:43 crc kubenswrapper[4891]: E0929 09:49:43.395316 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:43 crc kubenswrapper[4891]: E0929 09:49:43.395371 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.497849 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.497907 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.497920 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.497938 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.497949 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:43Z","lastTransitionTime":"2025-09-29T09:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.601058 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.601103 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.601112 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.601126 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.601137 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:43Z","lastTransitionTime":"2025-09-29T09:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.704242 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.704283 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.704294 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.704313 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.704331 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:43Z","lastTransitionTime":"2025-09-29T09:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.806501 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.806534 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.806544 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.806558 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.806568 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:43Z","lastTransitionTime":"2025-09-29T09:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.909677 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.909741 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.909758 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.909781 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:43 crc kubenswrapper[4891]: I0929 09:49:43.909835 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:43Z","lastTransitionTime":"2025-09-29T09:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.012609 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.012646 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.012656 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.012671 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.012682 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:44Z","lastTransitionTime":"2025-09-29T09:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.114738 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.114777 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.114803 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.114864 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.114877 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:44Z","lastTransitionTime":"2025-09-29T09:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.218681 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.219376 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.219460 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.219538 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.219596 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:44Z","lastTransitionTime":"2025-09-29T09:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.322280 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.322515 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.322628 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.322714 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.322780 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:44Z","lastTransitionTime":"2025-09-29T09:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.425424 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.425483 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.425494 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.425510 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.425520 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:44Z","lastTransitionTime":"2025-09-29T09:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.527666 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.527706 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.527717 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.527732 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.527741 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:44Z","lastTransitionTime":"2025-09-29T09:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.630241 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.630276 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.630284 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.630296 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.630305 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:44Z","lastTransitionTime":"2025-09-29T09:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.732307 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.732360 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.732371 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.732387 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.732400 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:44Z","lastTransitionTime":"2025-09-29T09:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.834086 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.834127 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.834136 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.834148 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.834157 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:44Z","lastTransitionTime":"2025-09-29T09:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.936504 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.936542 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.936552 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.936572 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:44 crc kubenswrapper[4891]: I0929 09:49:44.936583 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:44Z","lastTransitionTime":"2025-09-29T09:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.039406 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.039441 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.039451 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.039466 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.039476 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:45Z","lastTransitionTime":"2025-09-29T09:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.142500 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.142555 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.142565 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.142584 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.142594 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:45Z","lastTransitionTime":"2025-09-29T09:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.245747 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.245832 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.245850 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.245870 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.245886 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:45Z","lastTransitionTime":"2025-09-29T09:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.350069 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.350116 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.350127 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.350148 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.350159 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:45Z","lastTransitionTime":"2025-09-29T09:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.394923 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.394968 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.394973 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.395052 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:45 crc kubenswrapper[4891]: E0929 09:49:45.395077 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:45 crc kubenswrapper[4891]: E0929 09:49:45.395190 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:45 crc kubenswrapper[4891]: E0929 09:49:45.395259 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:45 crc kubenswrapper[4891]: E0929 09:49:45.395337 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.452673 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.452720 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.452729 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.452748 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.452760 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:45Z","lastTransitionTime":"2025-09-29T09:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.555268 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.555351 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.555371 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.555407 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.555432 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:45Z","lastTransitionTime":"2025-09-29T09:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.658377 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.658444 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.658468 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.658495 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.658512 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:45Z","lastTransitionTime":"2025-09-29T09:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.761215 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.761263 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.761273 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.761290 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.761302 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:45Z","lastTransitionTime":"2025-09-29T09:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.864963 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.865057 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.865093 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.865130 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.865156 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:45Z","lastTransitionTime":"2025-09-29T09:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.968968 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.969047 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.969063 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.969149 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:45 crc kubenswrapper[4891]: I0929 09:49:45.969173 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:45Z","lastTransitionTime":"2025-09-29T09:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.071971 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.072034 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.072048 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.072066 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.072079 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:46Z","lastTransitionTime":"2025-09-29T09:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.173952 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.173996 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.174008 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.174024 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.174040 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:46Z","lastTransitionTime":"2025-09-29T09:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.277451 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.278410 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.278428 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.278447 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.278458 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:46Z","lastTransitionTime":"2025-09-29T09:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.380694 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.380745 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.380754 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.380770 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.380780 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:46Z","lastTransitionTime":"2025-09-29T09:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.483366 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.483417 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.483431 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.483448 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.483460 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:46Z","lastTransitionTime":"2025-09-29T09:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.585916 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.585958 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.585968 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.585984 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.585994 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:46Z","lastTransitionTime":"2025-09-29T09:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.688414 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.688457 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.688466 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.688482 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.688493 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:46Z","lastTransitionTime":"2025-09-29T09:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.791296 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.791336 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.791346 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.791361 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.791371 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:46Z","lastTransitionTime":"2025-09-29T09:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.894018 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.894063 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.894072 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.894084 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.894094 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:46Z","lastTransitionTime":"2025-09-29T09:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.996875 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.996915 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.996924 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.996938 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:46 crc kubenswrapper[4891]: I0929 09:49:46.996948 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:46Z","lastTransitionTime":"2025-09-29T09:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.099428 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.099473 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.099487 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.099502 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.099514 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:47Z","lastTransitionTime":"2025-09-29T09:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.202188 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.202230 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.202238 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.202255 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.202421 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:47Z","lastTransitionTime":"2025-09-29T09:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.304547 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.304600 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.304614 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.304631 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.304642 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:47Z","lastTransitionTime":"2025-09-29T09:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.395045 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.395095 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.395026 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.395240 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:47 crc kubenswrapper[4891]: E0929 09:49:47.395309 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:47 crc kubenswrapper[4891]: E0929 09:49:47.395424 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:47 crc kubenswrapper[4891]: E0929 09:49:47.395497 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:47 crc kubenswrapper[4891]: E0929 09:49:47.395547 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.406547 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.406607 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.406615 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.406627 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.406636 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:47Z","lastTransitionTime":"2025-09-29T09:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.509899 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.509935 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.509945 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.509962 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.509977 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:47Z","lastTransitionTime":"2025-09-29T09:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.612541 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.612570 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.612578 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.612591 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.612600 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:47Z","lastTransitionTime":"2025-09-29T09:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.715238 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.715275 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.715288 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.715308 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.715320 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:47Z","lastTransitionTime":"2025-09-29T09:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.817900 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.817935 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.817944 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.817957 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.817967 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:47Z","lastTransitionTime":"2025-09-29T09:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.920757 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.920812 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.920823 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.920840 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:47 crc kubenswrapper[4891]: I0929 09:49:47.920850 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:47Z","lastTransitionTime":"2025-09-29T09:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.023579 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.023630 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.023641 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.023656 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.023668 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:48Z","lastTransitionTime":"2025-09-29T09:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.050716 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ngmm4_4bfce090-366c-43be-ab12-d291b4d25217/kube-multus/1.log" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.051329 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ngmm4_4bfce090-366c-43be-ab12-d291b4d25217/kube-multus/0.log" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.051395 4891 generic.go:334] "Generic (PLEG): container finished" podID="4bfce090-366c-43be-ab12-d291b4d25217" containerID="d37de03666ab8602a2a9d90c21788caee65748a0b3bdb4a81569c5bd05458aa8" exitCode=1 Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.051433 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ngmm4" event={"ID":"4bfce090-366c-43be-ab12-d291b4d25217","Type":"ContainerDied","Data":"d37de03666ab8602a2a9d90c21788caee65748a0b3bdb4a81569c5bd05458aa8"} Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.051470 4891 scope.go:117] "RemoveContainer" containerID="0173520fd902f3ea795edd992dbd586ee8444c867b1cf44534a55c68e7fcec1c" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.052401 4891 scope.go:117] "RemoveContainer" containerID="d37de03666ab8602a2a9d90c21788caee65748a0b3bdb4a81569c5bd05458aa8" Sep 29 09:49:48 crc kubenswrapper[4891]: E0929 09:49:48.052633 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-ngmm4_openshift-multus(4bfce090-366c-43be-ab12-d291b4d25217)\"" pod="openshift-multus/multus-ngmm4" podUID="4bfce090-366c-43be-ab12-d291b4d25217" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.089771 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5fhhd" 
podStartSLOduration=97.089743134 podStartE2EDuration="1m37.089743134s" podCreationTimestamp="2025-09-29 09:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:49:48.088111525 +0000 UTC m=+118.293279846" watchObservedRunningTime="2025-09-29 09:49:48.089743134 +0000 UTC m=+118.294911455" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.127366 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.127399 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.127419 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.127436 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.127448 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:48Z","lastTransitionTime":"2025-09-29T09:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.152190 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vxv4k" podStartSLOduration=96.152159681 podStartE2EDuration="1m36.152159681s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:49:48.151520252 +0000 UTC m=+118.356688583" watchObservedRunningTime="2025-09-29 09:49:48.152159681 +0000 UTC m=+118.357328002" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.230176 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.230246 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.230257 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.230283 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.230293 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:48Z","lastTransitionTime":"2025-09-29T09:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.333179 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.333223 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.333232 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.333247 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.333261 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:48Z","lastTransitionTime":"2025-09-29T09:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.436566 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.436633 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.436645 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.436661 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.436672 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:48Z","lastTransitionTime":"2025-09-29T09:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.539425 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.539476 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.539486 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.539500 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.539523 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:48Z","lastTransitionTime":"2025-09-29T09:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.642202 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.642234 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.642243 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.642278 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.642290 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:48Z","lastTransitionTime":"2025-09-29T09:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.745260 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.745327 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.745340 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.745359 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.745387 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:48Z","lastTransitionTime":"2025-09-29T09:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.848188 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.848255 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.848267 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.848284 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.848295 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:48Z","lastTransitionTime":"2025-09-29T09:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.951916 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.951965 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.951975 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.951988 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:48 crc kubenswrapper[4891]: I0929 09:49:48.951997 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:48Z","lastTransitionTime":"2025-09-29T09:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.054551 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.054593 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.054601 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.054615 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.054624 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:49Z","lastTransitionTime":"2025-09-29T09:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.057188 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ngmm4_4bfce090-366c-43be-ab12-d291b4d25217/kube-multus/1.log" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.157224 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.157277 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.157292 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.157307 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.157316 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:49Z","lastTransitionTime":"2025-09-29T09:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.259893 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.259936 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.259947 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.259964 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.259974 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:49Z","lastTransitionTime":"2025-09-29T09:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.363110 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.363160 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.363172 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.363192 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.363205 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:49Z","lastTransitionTime":"2025-09-29T09:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.394725 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.394824 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.394892 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:49 crc kubenswrapper[4891]: E0929 09:49:49.395022 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.395067 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:49 crc kubenswrapper[4891]: E0929 09:49:49.395125 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:49 crc kubenswrapper[4891]: E0929 09:49:49.395227 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:49 crc kubenswrapper[4891]: E0929 09:49:49.395286 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.466336 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.466379 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.466391 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.466408 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.466421 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:49Z","lastTransitionTime":"2025-09-29T09:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.569262 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.569302 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.569312 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.569328 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.569340 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:49Z","lastTransitionTime":"2025-09-29T09:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.672003 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.672066 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.672084 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.672112 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.672130 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:49Z","lastTransitionTime":"2025-09-29T09:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.775149 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.775193 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.775204 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.775220 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.775230 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:49Z","lastTransitionTime":"2025-09-29T09:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.877540 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.877601 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.877629 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.877645 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.877654 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:49Z","lastTransitionTime":"2025-09-29T09:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.980102 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.980138 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.980146 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.980159 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:49 crc kubenswrapper[4891]: I0929 09:49:49.980168 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:49Z","lastTransitionTime":"2025-09-29T09:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.082737 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.082774 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.082785 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.082819 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.082831 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:50Z","lastTransitionTime":"2025-09-29T09:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.163861 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.163927 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.163939 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.163959 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.163975 4891 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:49:50Z","lastTransitionTime":"2025-09-29T09:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.204548 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxgpg"] Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.204966 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxgpg" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.208810 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.208861 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.210593 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.210783 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.306252 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3400dad5-83e7-4653-9786-973aeeb42438-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wxgpg\" (UID: \"3400dad5-83e7-4653-9786-973aeeb42438\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxgpg" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.306317 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3400dad5-83e7-4653-9786-973aeeb42438-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wxgpg\" (UID: \"3400dad5-83e7-4653-9786-973aeeb42438\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxgpg" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.306343 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/3400dad5-83e7-4653-9786-973aeeb42438-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wxgpg\" (UID: \"3400dad5-83e7-4653-9786-973aeeb42438\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxgpg" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.306389 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3400dad5-83e7-4653-9786-973aeeb42438-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wxgpg\" (UID: \"3400dad5-83e7-4653-9786-973aeeb42438\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxgpg" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.306432 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3400dad5-83e7-4653-9786-973aeeb42438-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wxgpg\" (UID: \"3400dad5-83e7-4653-9786-973aeeb42438\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxgpg" Sep 29 09:49:50 crc kubenswrapper[4891]: E0929 09:49:50.392915 4891 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.407473 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3400dad5-83e7-4653-9786-973aeeb42438-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wxgpg\" (UID: \"3400dad5-83e7-4653-9786-973aeeb42438\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxgpg" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.407527 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3400dad5-83e7-4653-9786-973aeeb42438-service-ca\") pod 
\"cluster-version-operator-5c965bbfc6-wxgpg\" (UID: \"3400dad5-83e7-4653-9786-973aeeb42438\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxgpg" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.407553 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3400dad5-83e7-4653-9786-973aeeb42438-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wxgpg\" (UID: \"3400dad5-83e7-4653-9786-973aeeb42438\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxgpg" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.407597 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3400dad5-83e7-4653-9786-973aeeb42438-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wxgpg\" (UID: \"3400dad5-83e7-4653-9786-973aeeb42438\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxgpg" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.407623 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3400dad5-83e7-4653-9786-973aeeb42438-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wxgpg\" (UID: \"3400dad5-83e7-4653-9786-973aeeb42438\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxgpg" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.407674 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3400dad5-83e7-4653-9786-973aeeb42438-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wxgpg\" (UID: \"3400dad5-83e7-4653-9786-973aeeb42438\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxgpg" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.407674 4891 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3400dad5-83e7-4653-9786-973aeeb42438-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wxgpg\" (UID: \"3400dad5-83e7-4653-9786-973aeeb42438\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxgpg" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.409485 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.409518 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.418578 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3400dad5-83e7-4653-9786-973aeeb42438-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wxgpg\" (UID: \"3400dad5-83e7-4653-9786-973aeeb42438\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxgpg" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.421827 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.423573 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3400dad5-83e7-4653-9786-973aeeb42438-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wxgpg\" (UID: \"3400dad5-83e7-4653-9786-973aeeb42438\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxgpg" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.435044 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3400dad5-83e7-4653-9786-973aeeb42438-kube-api-access\") pod 
\"cluster-version-operator-5c965bbfc6-wxgpg\" (UID: \"3400dad5-83e7-4653-9786-973aeeb42438\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxgpg" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.520290 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Sep 29 09:49:50 crc kubenswrapper[4891]: E0929 09:49:50.522611 4891 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 09:49:50 crc kubenswrapper[4891]: I0929 09:49:50.527590 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxgpg" Sep 29 09:49:51 crc kubenswrapper[4891]: I0929 09:49:51.065530 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxgpg" event={"ID":"3400dad5-83e7-4653-9786-973aeeb42438","Type":"ContainerStarted","Data":"9c57284a064b17714188a1a7fb34118faa783a2ee7d737ca930e35408612ee02"} Sep 29 09:49:51 crc kubenswrapper[4891]: I0929 09:49:51.065854 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxgpg" event={"ID":"3400dad5-83e7-4653-9786-973aeeb42438","Type":"ContainerStarted","Data":"e8f8c15be67a9d7ce9120827e895f6ff09312ae0c603de0b157b9b9334abe8eb"} Sep 29 09:49:51 crc kubenswrapper[4891]: I0929 09:49:51.080313 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxgpg" podStartSLOduration=100.080294515 podStartE2EDuration="1m40.080294515s" podCreationTimestamp="2025-09-29 09:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-09-29 09:49:51.0787735 +0000 UTC m=+121.283941821" watchObservedRunningTime="2025-09-29 09:49:51.080294515 +0000 UTC m=+121.285462836" Sep 29 09:49:51 crc kubenswrapper[4891]: I0929 09:49:51.395199 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:51 crc kubenswrapper[4891]: E0929 09:49:51.395348 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:51 crc kubenswrapper[4891]: I0929 09:49:51.395380 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:51 crc kubenswrapper[4891]: I0929 09:49:51.395429 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:51 crc kubenswrapper[4891]: I0929 09:49:51.395448 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:51 crc kubenswrapper[4891]: E0929 09:49:51.395504 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:51 crc kubenswrapper[4891]: E0929 09:49:51.395603 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:51 crc kubenswrapper[4891]: E0929 09:49:51.395698 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:52 crc kubenswrapper[4891]: I0929 09:49:52.395852 4891 scope.go:117] "RemoveContainer" containerID="7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef" Sep 29 09:49:53 crc kubenswrapper[4891]: I0929 09:49:53.076102 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fs6qf_01bb1c54-d2f0-498e-ad60-8216c29b843d/ovnkube-controller/3.log" Sep 29 09:49:53 crc kubenswrapper[4891]: I0929 09:49:53.079641 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerStarted","Data":"911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543"} Sep 29 09:49:53 crc kubenswrapper[4891]: I0929 09:49:53.080147 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 
09:49:53 crc kubenswrapper[4891]: I0929 09:49:53.130228 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" podStartSLOduration=102.130211505 podStartE2EDuration="1m42.130211505s" podCreationTimestamp="2025-09-29 09:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:49:53.12871371 +0000 UTC m=+123.333882051" watchObservedRunningTime="2025-09-29 09:49:53.130211505 +0000 UTC m=+123.335379826" Sep 29 09:49:53 crc kubenswrapper[4891]: I0929 09:49:53.395238 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:53 crc kubenswrapper[4891]: I0929 09:49:53.395296 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:53 crc kubenswrapper[4891]: E0929 09:49:53.395391 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:53 crc kubenswrapper[4891]: I0929 09:49:53.395241 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:53 crc kubenswrapper[4891]: E0929 09:49:53.395503 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:53 crc kubenswrapper[4891]: I0929 09:49:53.395444 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:53 crc kubenswrapper[4891]: E0929 09:49:53.395754 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:53 crc kubenswrapper[4891]: E0929 09:49:53.395729 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:53 crc kubenswrapper[4891]: I0929 09:49:53.586480 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6thmw"] Sep 29 09:49:54 crc kubenswrapper[4891]: I0929 09:49:54.083119 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:54 crc kubenswrapper[4891]: E0929 09:49:54.083256 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:55 crc kubenswrapper[4891]: I0929 09:49:55.394758 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:55 crc kubenswrapper[4891]: I0929 09:49:55.394758 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:55 crc kubenswrapper[4891]: I0929 09:49:55.394782 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:55 crc kubenswrapper[4891]: E0929 09:49:55.395524 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:55 crc kubenswrapper[4891]: E0929 09:49:55.395533 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:55 crc kubenswrapper[4891]: E0929 09:49:55.395552 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:55 crc kubenswrapper[4891]: E0929 09:49:55.524616 4891 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 09:49:56 crc kubenswrapper[4891]: I0929 09:49:56.395403 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:56 crc kubenswrapper[4891]: E0929 09:49:56.395579 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:57 crc kubenswrapper[4891]: I0929 09:49:57.395577 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:57 crc kubenswrapper[4891]: E0929 09:49:57.395695 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:57 crc kubenswrapper[4891]: I0929 09:49:57.395917 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:57 crc kubenswrapper[4891]: I0929 09:49:57.395930 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:57 crc kubenswrapper[4891]: E0929 09:49:57.395987 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:57 crc kubenswrapper[4891]: E0929 09:49:57.396377 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:49:58 crc kubenswrapper[4891]: I0929 09:49:58.395627 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:49:58 crc kubenswrapper[4891]: E0929 09:49:58.395824 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:49:59 crc kubenswrapper[4891]: I0929 09:49:59.395566 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:49:59 crc kubenswrapper[4891]: I0929 09:49:59.395607 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:49:59 crc kubenswrapper[4891]: I0929 09:49:59.395645 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:49:59 crc kubenswrapper[4891]: E0929 09:49:59.396101 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:49:59 crc kubenswrapper[4891]: E0929 09:49:59.396167 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:49:59 crc kubenswrapper[4891]: I0929 09:49:59.396184 4891 scope.go:117] "RemoveContainer" containerID="d37de03666ab8602a2a9d90c21788caee65748a0b3bdb4a81569c5bd05458aa8" Sep 29 09:49:59 crc kubenswrapper[4891]: E0929 09:49:59.396323 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:50:00 crc kubenswrapper[4891]: I0929 09:50:00.107962 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ngmm4_4bfce090-366c-43be-ab12-d291b4d25217/kube-multus/1.log" Sep 29 09:50:00 crc kubenswrapper[4891]: I0929 09:50:00.108550 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ngmm4" event={"ID":"4bfce090-366c-43be-ab12-d291b4d25217","Type":"ContainerStarted","Data":"bdecff15ea67cb3d37c875fefda1df1046957856884f5759f93b13e4612a9bf3"} Sep 29 09:50:00 crc kubenswrapper[4891]: I0929 09:50:00.128495 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ngmm4" podStartSLOduration=109.128430191 podStartE2EDuration="1m49.128430191s" podCreationTimestamp="2025-09-29 09:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:00.128363579 +0000 UTC m=+130.333531900" watchObservedRunningTime="2025-09-29 09:50:00.128430191 +0000 UTC m=+130.333598522" Sep 29 09:50:00 crc kubenswrapper[4891]: I0929 09:50:00.395746 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:50:00 crc kubenswrapper[4891]: E0929 09:50:00.397177 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:50:00 crc kubenswrapper[4891]: E0929 09:50:00.525766 4891 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 09:50:01 crc kubenswrapper[4891]: I0929 09:50:01.395006 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:50:01 crc kubenswrapper[4891]: I0929 09:50:01.395119 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:50:01 crc kubenswrapper[4891]: I0929 09:50:01.395205 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:50:01 crc kubenswrapper[4891]: E0929 09:50:01.395197 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:50:01 crc kubenswrapper[4891]: E0929 09:50:01.395339 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:50:01 crc kubenswrapper[4891]: E0929 09:50:01.395549 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:50:02 crc kubenswrapper[4891]: I0929 09:50:02.395562 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:50:02 crc kubenswrapper[4891]: E0929 09:50:02.395762 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:50:03 crc kubenswrapper[4891]: I0929 09:50:03.394841 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:50:03 crc kubenswrapper[4891]: I0929 09:50:03.394903 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:50:03 crc kubenswrapper[4891]: I0929 09:50:03.394978 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:50:03 crc kubenswrapper[4891]: E0929 09:50:03.395029 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:50:03 crc kubenswrapper[4891]: E0929 09:50:03.395124 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:50:03 crc kubenswrapper[4891]: E0929 09:50:03.395252 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:50:04 crc kubenswrapper[4891]: I0929 09:50:04.395217 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:50:04 crc kubenswrapper[4891]: E0929 09:50:04.395464 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6thmw" podUID="45417d1e-e3f1-4cc9-9f51-65affc9d72f6" Sep 29 09:50:05 crc kubenswrapper[4891]: I0929 09:50:05.395511 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:50:05 crc kubenswrapper[4891]: I0929 09:50:05.395595 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:50:05 crc kubenswrapper[4891]: I0929 09:50:05.395704 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:50:05 crc kubenswrapper[4891]: E0929 09:50:05.395743 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:50:05 crc kubenswrapper[4891]: E0929 09:50:05.395920 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:50:05 crc kubenswrapper[4891]: E0929 09:50:05.395999 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:50:06 crc kubenswrapper[4891]: I0929 09:50:06.394866 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:50:06 crc kubenswrapper[4891]: I0929 09:50:06.397842 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Sep 29 09:50:06 crc kubenswrapper[4891]: I0929 09:50:06.397895 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Sep 29 09:50:07 crc kubenswrapper[4891]: I0929 09:50:07.395529 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:50:07 crc kubenswrapper[4891]: I0929 09:50:07.395647 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:50:07 crc kubenswrapper[4891]: I0929 09:50:07.395542 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:50:07 crc kubenswrapper[4891]: I0929 09:50:07.398901 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Sep 29 09:50:07 crc kubenswrapper[4891]: I0929 09:50:07.399739 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Sep 29 09:50:07 crc kubenswrapper[4891]: I0929 09:50:07.399929 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Sep 29 09:50:07 crc kubenswrapper[4891]: I0929 09:50:07.399873 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.276027 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.748021 4891 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.793375 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7tk8t"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.794070 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.794095 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fs2sv"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.794738 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.796674 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4hjlz"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.797600 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-pvm8n"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.798244 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dm6pg"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.798847 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dm6pg" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.798319 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pvm8n" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.798275 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-4hjlz" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.806955 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.807447 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.807722 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.807838 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.807936 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.808025 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.808323 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.808447 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.808570 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.808740 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.808865 4891 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.809032 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.809222 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.809383 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.809486 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.809044 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.809048 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.809073 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.809123 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.810655 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.809909 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.811000 4891 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.811014 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.808448 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.810673 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.809967 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.809941 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.811201 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.810909 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.811459 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.811777 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.817232 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Sep 29 
09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.817358 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-l9jdp"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.818149 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-l9jdp" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.820721 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-4nznm"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.827260 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.827417 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mhz5r"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.827673 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.827819 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.827970 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mhz5r" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.828299 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-28nrn"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.828707 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-28nrn" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.829125 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.836164 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-cc2bw"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.837041 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cc2bw" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.837858 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bspfg"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.838472 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bspfg" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.838876 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.839610 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.839647 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.839769 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.839823 4891 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-dockercfg-zdk86" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.839879 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.839918 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.840021 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.840066 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.840170 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.840214 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.840281 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.840354 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.840383 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.840484 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.840519 4891 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-dockercfg-f62pw" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.840616 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.840656 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.840741 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.840840 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.840971 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.840986 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.841091 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.840028 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.840616 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.841219 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.841296 4891 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pkk4x"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.841752 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.841785 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.841841 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-pkk4x" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.854417 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.856112 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.856970 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.857052 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.857166 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.857427 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.859010 4891 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"service-ca-bundle" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.860500 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkk7z"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.861100 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa68a099-1736-4f9a-bcaf-9840257afaeb-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4hjlz\" (UID: \"fa68a099-1736-4f9a-bcaf-9840257afaeb\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4hjlz" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.861214 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ce4282f-8661-4ec8-894e-ec921b4c6f4c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mhz5r\" (UID: \"3ce4282f-8661-4ec8-894e-ec921b4c6f4c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mhz5r" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.861293 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sthck\" (UniqueName: \"kubernetes.io/projected/303b949b-a531-46fa-a69d-6cc909009fc4-kube-api-access-sthck\") pod \"downloads-7954f5f757-l9jdp\" (UID: \"303b949b-a531-46fa-a69d-6cc909009fc4\") " pod="openshift-console/downloads-7954f5f757-l9jdp" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.861325 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e26e0265-569e-4929-8dd8-8b2665b37f81-audit-policies\") pod \"apiserver-7bbb656c7d-lzjgr\" (UID: \"e26e0265-569e-4929-8dd8-8b2665b37f81\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.861600 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgbvc\" (UniqueName: \"kubernetes.io/projected/ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb-kube-api-access-kgbvc\") pod \"authentication-operator-69f744f599-pkk4x\" (UID: \"ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pkk4x" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.871256 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.871451 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/76751bcd-3e42-47b3-bfa8-a89525f681f6-etcd-client\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.871885 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.872198 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.873560 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.874707 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tf4wv"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.874978 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7c8c8f0a-f35e-497d-b953-8b8353d2780e-machine-approver-tls\") pod \"machine-approver-56656f9798-pvm8n\" (UID: \"7c8c8f0a-f35e-497d-b953-8b8353d2780e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pvm8n" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875027 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k6qx\" (UniqueName: \"kubernetes.io/projected/7c8c8f0a-f35e-497d-b953-8b8353d2780e-kube-api-access-2k6qx\") pod \"machine-approver-56656f9798-pvm8n\" (UID: \"7c8c8f0a-f35e-497d-b953-8b8353d2780e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pvm8n" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875058 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e320dc35-e65d-489f-b752-da6f9eda884f-default-certificate\") pod \"router-default-5444994796-28nrn\" (UID: \"e320dc35-e65d-489f-b752-da6f9eda884f\") " pod="openshift-ingress/router-default-5444994796-28nrn" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875084 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/482b69f0-36a6-4320-8ea5-9e1263400532-config\") pod \"controller-manager-879f6c89f-7tk8t\" (UID: \"482b69f0-36a6-4320-8ea5-9e1263400532\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875095 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mgnzm"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875108 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7c8c8f0a-f35e-497d-b953-8b8353d2780e-auth-proxy-config\") pod \"machine-approver-56656f9798-pvm8n\" (UID: \"7c8c8f0a-f35e-497d-b953-8b8353d2780e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pvm8n" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875132 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1792c60-bbca-441c-9c02-662c476c2d74-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dm6pg\" (UID: \"f1792c60-bbca-441c-9c02-662c476c2d74\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dm6pg" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875160 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4fe5a5b3-033b-4d7e-8829-65de16f908a2-trusted-ca-bundle\") pod \"console-f9d7485db-4nznm\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875181 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb-config\") pod \"authentication-operator-69f744f599-pkk4x\" (UID: \"ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pkk4x" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875201 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76751bcd-3e42-47b3-bfa8-a89525f681f6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " 
pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875223 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9j6x\" (UniqueName: \"kubernetes.io/projected/3ce4282f-8661-4ec8-894e-ec921b4c6f4c-kube-api-access-x9j6x\") pod \"kube-storage-version-migrator-operator-b67b599dd-mhz5r\" (UID: \"3ce4282f-8661-4ec8-894e-ec921b4c6f4c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mhz5r" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875288 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e26e0265-569e-4929-8dd8-8b2665b37f81-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lzjgr\" (UID: \"e26e0265-569e-4929-8dd8-8b2665b37f81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875312 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1792c60-bbca-441c-9c02-662c476c2d74-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dm6pg\" (UID: \"f1792c60-bbca-441c-9c02-662c476c2d74\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dm6pg" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875341 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fa68a099-1736-4f9a-bcaf-9840257afaeb-images\") pod \"machine-api-operator-5694c8668f-4hjlz\" (UID: \"fa68a099-1736-4f9a-bcaf-9840257afaeb\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4hjlz" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875374 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4fe5a5b3-033b-4d7e-8829-65de16f908a2-console-serving-cert\") pod \"console-f9d7485db-4nznm\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875391 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb-serving-cert\") pod \"authentication-operator-69f744f599-pkk4x\" (UID: \"ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pkk4x" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875413 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzmpp\" (UniqueName: \"kubernetes.io/projected/570d72c8-d4ed-4b0a-876a-5a942b32a958-kube-api-access-qzmpp\") pod \"control-plane-machine-set-operator-78cbb6b69f-bspfg\" (UID: \"570d72c8-d4ed-4b0a-876a-5a942b32a958\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bspfg" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875440 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/76751bcd-3e42-47b3-bfa8-a89525f681f6-etcd-serving-ca\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875460 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkc69\" (UniqueName: \"kubernetes.io/projected/482b69f0-36a6-4320-8ea5-9e1263400532-kube-api-access-mkc69\") pod \"controller-manager-879f6c89f-7tk8t\" (UID: \"482b69f0-36a6-4320-8ea5-9e1263400532\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875488 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e26e0265-569e-4929-8dd8-8b2665b37f81-serving-cert\") pod \"apiserver-7bbb656c7d-lzjgr\" (UID: \"e26e0265-569e-4929-8dd8-8b2665b37f81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875511 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/76751bcd-3e42-47b3-bfa8-a89525f681f6-encryption-config\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875530 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/482b69f0-36a6-4320-8ea5-9e1263400532-client-ca\") pod \"controller-manager-879f6c89f-7tk8t\" (UID: \"482b69f0-36a6-4320-8ea5-9e1263400532\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875541 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875552 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e320dc35-e65d-489f-b752-da6f9eda884f-service-ca-bundle\") pod \"router-default-5444994796-28nrn\" (UID: \"e320dc35-e65d-489f-b752-da6f9eda884f\") " pod="openshift-ingress/router-default-5444994796-28nrn" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875575 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ce4282f-8661-4ec8-894e-ec921b4c6f4c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mhz5r\" (UID: \"3ce4282f-8661-4ec8-894e-ec921b4c6f4c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mhz5r" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875596 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4fe5a5b3-033b-4d7e-8829-65de16f908a2-console-oauth-config\") pod \"console-f9d7485db-4nznm\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875613 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4fe5a5b3-033b-4d7e-8829-65de16f908a2-console-config\") pod \"console-f9d7485db-4nznm\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875630 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/76751bcd-3e42-47b3-bfa8-a89525f681f6-node-pullsecrets\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875649 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0971f343-a162-4db1-96bb-3857bd667ad2-serving-cert\") pod \"openshift-config-operator-7777fb866f-cc2bw\" (UID: \"0971f343-a162-4db1-96bb-3857bd667ad2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cc2bw" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875674 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dlg9\" (UniqueName: \"kubernetes.io/projected/e26e0265-569e-4929-8dd8-8b2665b37f81-kube-api-access-9dlg9\") pod \"apiserver-7bbb656c7d-lzjgr\" (UID: \"e26e0265-569e-4929-8dd8-8b2665b37f81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875694 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/570d72c8-d4ed-4b0a-876a-5a942b32a958-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bspfg\" (UID: \"570d72c8-d4ed-4b0a-876a-5a942b32a958\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bspfg" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875721 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e320dc35-e65d-489f-b752-da6f9eda884f-metrics-certs\") pod \"router-default-5444994796-28nrn\" (UID: 
\"e320dc35-e65d-489f-b752-da6f9eda884f\") " pod="openshift-ingress/router-default-5444994796-28nrn" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875745 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/76751bcd-3e42-47b3-bfa8-a89525f681f6-image-import-ca\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875781 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e26e0265-569e-4929-8dd8-8b2665b37f81-etcd-client\") pod \"apiserver-7bbb656c7d-lzjgr\" (UID: \"e26e0265-569e-4929-8dd8-8b2665b37f81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875831 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/482b69f0-36a6-4320-8ea5-9e1263400532-serving-cert\") pod \"controller-manager-879f6c89f-7tk8t\" (UID: \"482b69f0-36a6-4320-8ea5-9e1263400532\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875851 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e26e0265-569e-4929-8dd8-8b2665b37f81-encryption-config\") pod \"apiserver-7bbb656c7d-lzjgr\" (UID: \"e26e0265-569e-4929-8dd8-8b2665b37f81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875872 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/76751bcd-3e42-47b3-bfa8-a89525f681f6-audit\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875890 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76751bcd-3e42-47b3-bfa8-a89525f681f6-audit-dir\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875909 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8r8q\" (UniqueName: \"kubernetes.io/projected/76751bcd-3e42-47b3-bfa8-a89525f681f6-kube-api-access-n8r8q\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875925 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl2mb\" (UniqueName: \"kubernetes.io/projected/4fe5a5b3-033b-4d7e-8829-65de16f908a2-kube-api-access-rl2mb\") pod \"console-f9d7485db-4nznm\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875941 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pkk4x\" (UID: \"ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pkk4x" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875959 4891 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e26e0265-569e-4929-8dd8-8b2665b37f81-audit-dir\") pod \"apiserver-7bbb656c7d-lzjgr\" (UID: \"e26e0265-569e-4929-8dd8-8b2665b37f81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875975 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb-service-ca-bundle\") pod \"authentication-operator-69f744f599-pkk4x\" (UID: \"ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pkk4x" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.875992 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0971f343-a162-4db1-96bb-3857bd667ad2-available-featuregates\") pod \"openshift-config-operator-7777fb866f-cc2bw\" (UID: \"0971f343-a162-4db1-96bb-3857bd667ad2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cc2bw" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.876010 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4fe5a5b3-033b-4d7e-8829-65de16f908a2-oauth-serving-cert\") pod \"console-f9d7485db-4nznm\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.876029 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4fe5a5b3-033b-4d7e-8829-65de16f908a2-service-ca\") pod \"console-f9d7485db-4nznm\" (UID: 
\"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.876047 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76751bcd-3e42-47b3-bfa8-a89525f681f6-serving-cert\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.876067 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c8c8f0a-f35e-497d-b953-8b8353d2780e-config\") pod \"machine-approver-56656f9798-pvm8n\" (UID: \"7c8c8f0a-f35e-497d-b953-8b8353d2780e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pvm8n" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.876085 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/482b69f0-36a6-4320-8ea5-9e1263400532-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7tk8t\" (UID: \"482b69f0-36a6-4320-8ea5-9e1263400532\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.876103 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa68a099-1736-4f9a-bcaf-9840257afaeb-config\") pod \"machine-api-operator-5694c8668f-4hjlz\" (UID: \"fa68a099-1736-4f9a-bcaf-9840257afaeb\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4hjlz" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.876125 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf772\" (UniqueName: 
\"kubernetes.io/projected/0971f343-a162-4db1-96bb-3857bd667ad2-kube-api-access-nf772\") pod \"openshift-config-operator-7777fb866f-cc2bw\" (UID: \"0971f343-a162-4db1-96bb-3857bd667ad2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cc2bw" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.876142 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfhzr\" (UniqueName: \"kubernetes.io/projected/f1792c60-bbca-441c-9c02-662c476c2d74-kube-api-access-rfhzr\") pod \"openshift-apiserver-operator-796bbdcf4f-dm6pg\" (UID: \"f1792c60-bbca-441c-9c02-662c476c2d74\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dm6pg" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.876167 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e320dc35-e65d-489f-b752-da6f9eda884f-stats-auth\") pod \"router-default-5444994796-28nrn\" (UID: \"e320dc35-e65d-489f-b752-da6f9eda884f\") " pod="openshift-ingress/router-default-5444994796-28nrn" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.876186 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gldb\" (UniqueName: \"kubernetes.io/projected/e320dc35-e65d-489f-b752-da6f9eda884f-kube-api-access-7gldb\") pod \"router-default-5444994796-28nrn\" (UID: \"e320dc35-e65d-489f-b752-da6f9eda884f\") " pod="openshift-ingress/router-default-5444994796-28nrn" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.876202 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76751bcd-3e42-47b3-bfa8-a89525f681f6-config\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 
crc kubenswrapper[4891]: I0929 09:50:10.876218 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e26e0265-569e-4929-8dd8-8b2665b37f81-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lzjgr\" (UID: \"e26e0265-569e-4929-8dd8-8b2665b37f81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.876235 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fg6n\" (UniqueName: \"kubernetes.io/projected/fa68a099-1736-4f9a-bcaf-9840257afaeb-kube-api-access-9fg6n\") pod \"machine-api-operator-5694c8668f-4hjlz\" (UID: \"fa68a099-1736-4f9a-bcaf-9840257afaeb\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4hjlz" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.876833 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkk7z" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.877108 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tf4wv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.878576 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.879169 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lkthp"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.879703 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.879844 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-lkthp" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.880063 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.886420 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-54mnr"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.887418 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5t75"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.888033 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5t75" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.888537 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-54mnr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.889213 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.894210 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.894484 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.894613 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.894730 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.894867 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.894965 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.895054 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.895425 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.897901 4891 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.900004 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.900235 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.900744 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.901172 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.901445 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.901538 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.901641 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.901727 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.902386 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.902493 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.902634 4891 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.902731 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.903154 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g6rvx"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.904006 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g5tqf"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.904397 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g5tqf" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.904835 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g6rvx" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.909475 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.909734 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.909876 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.911672 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.911966 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jnfml"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.912604 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-jnfml" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.912668 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.912857 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.913355 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.913726 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9c55k"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.914137 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9c55k" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.917013 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.919433 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-h4vv2"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.920107 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cd8dx"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.920366 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4k9wr"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.920833 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4k9wr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.921239 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-h4vv2" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.921455 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.944402 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-n6x5w"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.951760 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.953410 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jf4xk"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.954344 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n6x5w" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.954753 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-llvr5"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.954932 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jf4xk" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.956200 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.956881 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4nmlj"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.957694 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-llvr5" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.957699 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4nmlj" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.963227 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfbzg"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.963815 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfbzg" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.966034 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dm6pg"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.967842 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9t6nh"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.968610 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-9t6nh" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.971707 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9nk5w"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.972707 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9nk5w" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.974476 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-96vcr"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.975942 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-96vcr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.976883 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl2mb\" (UniqueName: \"kubernetes.io/projected/4fe5a5b3-033b-4d7e-8829-65de16f908a2-kube-api-access-rl2mb\") pod \"console-f9d7485db-4nznm\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.976920 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pkk4x\" (UID: \"ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pkk4x" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.976950 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e26e0265-569e-4929-8dd8-8b2665b37f81-audit-dir\") pod \"apiserver-7bbb656c7d-lzjgr\" (UID: \"e26e0265-569e-4929-8dd8-8b2665b37f81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.976965 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb-service-ca-bundle\") pod \"authentication-operator-69f744f599-pkk4x\" (UID: \"ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pkk4x" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.976983 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/0971f343-a162-4db1-96bb-3857bd667ad2-available-featuregates\") pod \"openshift-config-operator-7777fb866f-cc2bw\" (UID: \"0971f343-a162-4db1-96bb-3857bd667ad2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cc2bw" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977003 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9sns\" (UniqueName: \"kubernetes.io/projected/96fa581a-e2fa-4c16-a646-4ac94eec1ef0-kube-api-access-g9sns\") pod \"openshift-controller-manager-operator-756b6f6bc6-tf4wv\" (UID: \"96fa581a-e2fa-4c16-a646-4ac94eec1ef0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tf4wv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977025 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4fe5a5b3-033b-4d7e-8829-65de16f908a2-oauth-serving-cert\") pod \"console-f9d7485db-4nznm\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977041 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4fe5a5b3-033b-4d7e-8829-65de16f908a2-service-ca\") pod \"console-f9d7485db-4nznm\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977057 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76751bcd-3e42-47b3-bfa8-a89525f681f6-serving-cert\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977072 4891 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c8c8f0a-f35e-497d-b953-8b8353d2780e-config\") pod \"machine-approver-56656f9798-pvm8n\" (UID: \"7c8c8f0a-f35e-497d-b953-8b8353d2780e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pvm8n" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977090 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/482b69f0-36a6-4320-8ea5-9e1263400532-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7tk8t\" (UID: \"482b69f0-36a6-4320-8ea5-9e1263400532\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977106 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa68a099-1736-4f9a-bcaf-9840257afaeb-config\") pod \"machine-api-operator-5694c8668f-4hjlz\" (UID: \"fa68a099-1736-4f9a-bcaf-9840257afaeb\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4hjlz" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977123 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf772\" (UniqueName: \"kubernetes.io/projected/0971f343-a162-4db1-96bb-3857bd667ad2-kube-api-access-nf772\") pod \"openshift-config-operator-7777fb866f-cc2bw\" (UID: \"0971f343-a162-4db1-96bb-3857bd667ad2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cc2bw" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977148 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e320dc35-e65d-489f-b752-da6f9eda884f-stats-auth\") pod \"router-default-5444994796-28nrn\" (UID: \"e320dc35-e65d-489f-b752-da6f9eda884f\") " 
pod="openshift-ingress/router-default-5444994796-28nrn" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977165 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfhzr\" (UniqueName: \"kubernetes.io/projected/f1792c60-bbca-441c-9c02-662c476c2d74-kube-api-access-rfhzr\") pod \"openshift-apiserver-operator-796bbdcf4f-dm6pg\" (UID: \"f1792c60-bbca-441c-9c02-662c476c2d74\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dm6pg" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977184 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96fa581a-e2fa-4c16-a646-4ac94eec1ef0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tf4wv\" (UID: \"96fa581a-e2fa-4c16-a646-4ac94eec1ef0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tf4wv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977199 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gldb\" (UniqueName: \"kubernetes.io/projected/e320dc35-e65d-489f-b752-da6f9eda884f-kube-api-access-7gldb\") pod \"router-default-5444994796-28nrn\" (UID: \"e320dc35-e65d-489f-b752-da6f9eda884f\") " pod="openshift-ingress/router-default-5444994796-28nrn" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977214 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76751bcd-3e42-47b3-bfa8-a89525f681f6-config\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977238 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e26e0265-569e-4929-8dd8-8b2665b37f81-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lzjgr\" (UID: \"e26e0265-569e-4929-8dd8-8b2665b37f81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977260 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fg6n\" (UniqueName: \"kubernetes.io/projected/fa68a099-1736-4f9a-bcaf-9840257afaeb-kube-api-access-9fg6n\") pod \"machine-api-operator-5694c8668f-4hjlz\" (UID: \"fa68a099-1736-4f9a-bcaf-9840257afaeb\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4hjlz" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977281 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa68a099-1736-4f9a-bcaf-9840257afaeb-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4hjlz\" (UID: \"fa68a099-1736-4f9a-bcaf-9840257afaeb\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4hjlz" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977305 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ce4282f-8661-4ec8-894e-ec921b4c6f4c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mhz5r\" (UID: \"3ce4282f-8661-4ec8-894e-ec921b4c6f4c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mhz5r" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977325 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sthck\" (UniqueName: \"kubernetes.io/projected/303b949b-a531-46fa-a69d-6cc909009fc4-kube-api-access-sthck\") pod \"downloads-7954f5f757-l9jdp\" (UID: \"303b949b-a531-46fa-a69d-6cc909009fc4\") " pod="openshift-console/downloads-7954f5f757-l9jdp" Sep 29 09:50:10 crc 
kubenswrapper[4891]: I0929 09:50:10.977343 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e26e0265-569e-4929-8dd8-8b2665b37f81-audit-policies\") pod \"apiserver-7bbb656c7d-lzjgr\" (UID: \"e26e0265-569e-4929-8dd8-8b2665b37f81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977547 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgbvc\" (UniqueName: \"kubernetes.io/projected/ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb-kube-api-access-kgbvc\") pod \"authentication-operator-69f744f599-pkk4x\" (UID: \"ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pkk4x" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977573 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/76751bcd-3e42-47b3-bfa8-a89525f681f6-etcd-client\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977591 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7c8c8f0a-f35e-497d-b953-8b8353d2780e-machine-approver-tls\") pod \"machine-approver-56656f9798-pvm8n\" (UID: \"7c8c8f0a-f35e-497d-b953-8b8353d2780e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pvm8n" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977605 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k6qx\" (UniqueName: \"kubernetes.io/projected/7c8c8f0a-f35e-497d-b953-8b8353d2780e-kube-api-access-2k6qx\") pod \"machine-approver-56656f9798-pvm8n\" (UID: 
\"7c8c8f0a-f35e-497d-b953-8b8353d2780e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pvm8n" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977620 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e320dc35-e65d-489f-b752-da6f9eda884f-default-certificate\") pod \"router-default-5444994796-28nrn\" (UID: \"e320dc35-e65d-489f-b752-da6f9eda884f\") " pod="openshift-ingress/router-default-5444994796-28nrn" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977639 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/482b69f0-36a6-4320-8ea5-9e1263400532-config\") pod \"controller-manager-879f6c89f-7tk8t\" (UID: \"482b69f0-36a6-4320-8ea5-9e1263400532\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977657 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7c8c8f0a-f35e-497d-b953-8b8353d2780e-auth-proxy-config\") pod \"machine-approver-56656f9798-pvm8n\" (UID: \"7c8c8f0a-f35e-497d-b953-8b8353d2780e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pvm8n" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977675 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1792c60-bbca-441c-9c02-662c476c2d74-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dm6pg\" (UID: \"f1792c60-bbca-441c-9c02-662c476c2d74\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dm6pg" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977691 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4fe5a5b3-033b-4d7e-8829-65de16f908a2-trusted-ca-bundle\") pod \"console-f9d7485db-4nznm\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977710 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb-config\") pod \"authentication-operator-69f744f599-pkk4x\" (UID: \"ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pkk4x" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977752 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76751bcd-3e42-47b3-bfa8-a89525f681f6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977772 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9j6x\" (UniqueName: \"kubernetes.io/projected/3ce4282f-8661-4ec8-894e-ec921b4c6f4c-kube-api-access-x9j6x\") pod \"kube-storage-version-migrator-operator-b67b599dd-mhz5r\" (UID: \"3ce4282f-8661-4ec8-894e-ec921b4c6f4c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mhz5r" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977804 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e26e0265-569e-4929-8dd8-8b2665b37f81-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lzjgr\" (UID: \"e26e0265-569e-4929-8dd8-8b2665b37f81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977822 
4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1792c60-bbca-441c-9c02-662c476c2d74-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dm6pg\" (UID: \"f1792c60-bbca-441c-9c02-662c476c2d74\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dm6pg" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977845 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fa68a099-1736-4f9a-bcaf-9840257afaeb-images\") pod \"machine-api-operator-5694c8668f-4hjlz\" (UID: \"fa68a099-1736-4f9a-bcaf-9840257afaeb\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4hjlz" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977870 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4fe5a5b3-033b-4d7e-8829-65de16f908a2-console-serving-cert\") pod \"console-f9d7485db-4nznm\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977885 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb-serving-cert\") pod \"authentication-operator-69f744f599-pkk4x\" (UID: \"ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pkk4x" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977903 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzmpp\" (UniqueName: \"kubernetes.io/projected/570d72c8-d4ed-4b0a-876a-5a942b32a958-kube-api-access-qzmpp\") pod \"control-plane-machine-set-operator-78cbb6b69f-bspfg\" (UID: \"570d72c8-d4ed-4b0a-876a-5a942b32a958\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bspfg" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977926 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e26e0265-569e-4929-8dd8-8b2665b37f81-serving-cert\") pod \"apiserver-7bbb656c7d-lzjgr\" (UID: \"e26e0265-569e-4929-8dd8-8b2665b37f81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977941 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/76751bcd-3e42-47b3-bfa8-a89525f681f6-etcd-serving-ca\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977958 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkc69\" (UniqueName: \"kubernetes.io/projected/482b69f0-36a6-4320-8ea5-9e1263400532-kube-api-access-mkc69\") pod \"controller-manager-879f6c89f-7tk8t\" (UID: \"482b69f0-36a6-4320-8ea5-9e1263400532\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977975 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/76751bcd-3e42-47b3-bfa8-a89525f681f6-encryption-config\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.977994 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/482b69f0-36a6-4320-8ea5-9e1263400532-client-ca\") pod \"controller-manager-879f6c89f-7tk8t\" (UID: 
\"482b69f0-36a6-4320-8ea5-9e1263400532\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.978011 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e320dc35-e65d-489f-b752-da6f9eda884f-service-ca-bundle\") pod \"router-default-5444994796-28nrn\" (UID: \"e320dc35-e65d-489f-b752-da6f9eda884f\") " pod="openshift-ingress/router-default-5444994796-28nrn" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.978028 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ce4282f-8661-4ec8-894e-ec921b4c6f4c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mhz5r\" (UID: \"3ce4282f-8661-4ec8-894e-ec921b4c6f4c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mhz5r" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.978042 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4fe5a5b3-033b-4d7e-8829-65de16f908a2-console-oauth-config\") pod \"console-f9d7485db-4nznm\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.978058 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4fe5a5b3-033b-4d7e-8829-65de16f908a2-console-config\") pod \"console-f9d7485db-4nznm\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.978074 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dlg9\" (UniqueName: 
\"kubernetes.io/projected/e26e0265-569e-4929-8dd8-8b2665b37f81-kube-api-access-9dlg9\") pod \"apiserver-7bbb656c7d-lzjgr\" (UID: \"e26e0265-569e-4929-8dd8-8b2665b37f81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.978088 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/76751bcd-3e42-47b3-bfa8-a89525f681f6-node-pullsecrets\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.978103 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0971f343-a162-4db1-96bb-3857bd667ad2-serving-cert\") pod \"openshift-config-operator-7777fb866f-cc2bw\" (UID: \"0971f343-a162-4db1-96bb-3857bd667ad2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cc2bw" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.978120 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/570d72c8-d4ed-4b0a-876a-5a942b32a958-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bspfg\" (UID: \"570d72c8-d4ed-4b0a-876a-5a942b32a958\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bspfg" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.978137 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e320dc35-e65d-489f-b752-da6f9eda884f-metrics-certs\") pod \"router-default-5444994796-28nrn\" (UID: \"e320dc35-e65d-489f-b752-da6f9eda884f\") " pod="openshift-ingress/router-default-5444994796-28nrn" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 
09:50:10.978155 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/76751bcd-3e42-47b3-bfa8-a89525f681f6-image-import-ca\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.978180 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e26e0265-569e-4929-8dd8-8b2665b37f81-etcd-client\") pod \"apiserver-7bbb656c7d-lzjgr\" (UID: \"e26e0265-569e-4929-8dd8-8b2665b37f81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.978196 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/482b69f0-36a6-4320-8ea5-9e1263400532-serving-cert\") pod \"controller-manager-879f6c89f-7tk8t\" (UID: \"482b69f0-36a6-4320-8ea5-9e1263400532\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.978213 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e26e0265-569e-4929-8dd8-8b2665b37f81-encryption-config\") pod \"apiserver-7bbb656c7d-lzjgr\" (UID: \"e26e0265-569e-4929-8dd8-8b2665b37f81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.978237 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96fa581a-e2fa-4c16-a646-4ac94eec1ef0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-tf4wv\" (UID: \"96fa581a-e2fa-4c16-a646-4ac94eec1ef0\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tf4wv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.978265 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/76751bcd-3e42-47b3-bfa8-a89525f681f6-audit\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.978286 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76751bcd-3e42-47b3-bfa8-a89525f681f6-audit-dir\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.978302 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8r8q\" (UniqueName: \"kubernetes.io/projected/76751bcd-3e42-47b3-bfa8-a89525f681f6-kube-api-access-n8r8q\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.979593 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-khp2g"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.979784 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pkk4x\" (UID: \"ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pkk4x" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.979869 4891 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e26e0265-569e-4929-8dd8-8b2665b37f81-audit-dir\") pod \"apiserver-7bbb656c7d-lzjgr\" (UID: \"e26e0265-569e-4929-8dd8-8b2665b37f81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.979911 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e26e0265-569e-4929-8dd8-8b2665b37f81-audit-policies\") pod \"apiserver-7bbb656c7d-lzjgr\" (UID: \"e26e0265-569e-4929-8dd8-8b2665b37f81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.980338 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4fe5a5b3-033b-4d7e-8829-65de16f908a2-trusted-ca-bundle\") pod \"console-f9d7485db-4nznm\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.980365 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb-service-ca-bundle\") pod \"authentication-operator-69f744f599-pkk4x\" (UID: \"ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pkk4x" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.980403 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29318985-cvhgh"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.983096 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa68a099-1736-4f9a-bcaf-9840257afaeb-config\") pod \"machine-api-operator-5694c8668f-4hjlz\" (UID: 
\"fa68a099-1736-4f9a-bcaf-9840257afaeb\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4hjlz" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.980910 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76751bcd-3e42-47b3-bfa8-a89525f681f6-config\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.981123 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb-config\") pod \"authentication-operator-69f744f599-pkk4x\" (UID: \"ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pkk4x" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.981426 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-khp2g" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.983857 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm7n"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.983899 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fa68a099-1736-4f9a-bcaf-9840257afaeb-images\") pod \"machine-api-operator-5694c8668f-4hjlz\" (UID: \"fa68a099-1736-4f9a-bcaf-9840257afaeb\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4hjlz" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.985318 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/76751bcd-3e42-47b3-bfa8-a89525f681f6-image-import-ca\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.985522 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/76751bcd-3e42-47b3-bfa8-a89525f681f6-node-pullsecrets\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.986399 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4fe5a5b3-033b-4d7e-8829-65de16f908a2-console-config\") pod \"console-f9d7485db-4nznm\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.986835 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1792c60-bbca-441c-9c02-662c476c2d74-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dm6pg\" (UID: \"f1792c60-bbca-441c-9c02-662c476c2d74\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dm6pg" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.986884 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7tk8t"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.986915 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6lfmx"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.981643 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e26e0265-569e-4929-8dd8-8b2665b37f81-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lzjgr\" (UID: \"e26e0265-569e-4929-8dd8-8b2665b37f81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.988045 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4hjlz"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.988166 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6lfmx" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.980848 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0971f343-a162-4db1-96bb-3857bd667ad2-available-featuregates\") pod \"openshift-config-operator-7777fb866f-cc2bw\" (UID: \"0971f343-a162-4db1-96bb-3857bd667ad2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cc2bw" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.989071 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-cvhgh" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.989176 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/76751bcd-3e42-47b3-bfa8-a89525f681f6-audit\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.982441 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e26e0265-569e-4929-8dd8-8b2665b37f81-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lzjgr\" (UID: \"e26e0265-569e-4929-8dd8-8b2665b37f81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.989254 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76751bcd-3e42-47b3-bfa8-a89525f681f6-audit-dir\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.989274 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm7n" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.989523 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76751bcd-3e42-47b3-bfa8-a89525f681f6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.990049 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mhz5r"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.990083 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fs2sv"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.990260 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76751bcd-3e42-47b3-bfa8-a89525f681f6-serving-cert\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.990396 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4fe5a5b3-033b-4d7e-8829-65de16f908a2-service-ca\") pod \"console-f9d7485db-4nznm\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.990652 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tf4wv"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.990990 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/76751bcd-3e42-47b3-bfa8-a89525f681f6-etcd-client\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.991823 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/76751bcd-3e42-47b3-bfa8-a89525f681f6-etcd-serving-ca\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.992044 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-llvr5"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.992743 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ce4282f-8661-4ec8-894e-ec921b4c6f4c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mhz5r\" (UID: \"3ce4282f-8661-4ec8-894e-ec921b4c6f4c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mhz5r" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.992984 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e320dc35-e65d-489f-b752-da6f9eda884f-service-ca-bundle\") pod \"router-default-5444994796-28nrn\" (UID: \"e320dc35-e65d-489f-b752-da6f9eda884f\") " pod="openshift-ingress/router-default-5444994796-28nrn" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.993140 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ce4282f-8661-4ec8-894e-ec921b4c6f4c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mhz5r\" (UID: 
\"3ce4282f-8661-4ec8-894e-ec921b4c6f4c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mhz5r" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.993329 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-54mnr"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.993380 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c8c8f0a-f35e-497d-b953-8b8353d2780e-config\") pod \"machine-approver-56656f9798-pvm8n\" (UID: \"7c8c8f0a-f35e-497d-b953-8b8353d2780e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pvm8n" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.993552 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7c8c8f0a-f35e-497d-b953-8b8353d2780e-auth-proxy-config\") pod \"machine-approver-56656f9798-pvm8n\" (UID: \"7c8c8f0a-f35e-497d-b953-8b8353d2780e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pvm8n" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.993565 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa68a099-1736-4f9a-bcaf-9840257afaeb-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4hjlz\" (UID: \"fa68a099-1736-4f9a-bcaf-9840257afaeb\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4hjlz" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.994602 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-cc2bw"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.995078 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0971f343-a162-4db1-96bb-3857bd667ad2-serving-cert\") pod \"openshift-config-operator-7777fb866f-cc2bw\" (UID: \"0971f343-a162-4db1-96bb-3857bd667ad2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cc2bw" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.995158 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e26e0265-569e-4929-8dd8-8b2665b37f81-etcd-client\") pod \"apiserver-7bbb656c7d-lzjgr\" (UID: \"e26e0265-569e-4929-8dd8-8b2665b37f81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.995260 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-l9jdp"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.996317 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/482b69f0-36a6-4320-8ea5-9e1263400532-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7tk8t\" (UID: \"482b69f0-36a6-4320-8ea5-9e1263400532\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.996491 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e26e0265-569e-4929-8dd8-8b2665b37f81-serving-cert\") pod \"apiserver-7bbb656c7d-lzjgr\" (UID: \"e26e0265-569e-4929-8dd8-8b2665b37f81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.996531 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/76751bcd-3e42-47b3-bfa8-a89525f681f6-encryption-config\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" 
Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.996542 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7c8c8f0a-f35e-497d-b953-8b8353d2780e-machine-approver-tls\") pod \"machine-approver-56656f9798-pvm8n\" (UID: \"7c8c8f0a-f35e-497d-b953-8b8353d2780e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pvm8n" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.997108 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/570d72c8-d4ed-4b0a-876a-5a942b32a958-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bspfg\" (UID: \"570d72c8-d4ed-4b0a-876a-5a942b32a958\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bspfg" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.997215 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e320dc35-e65d-489f-b752-da6f9eda884f-default-certificate\") pod \"router-default-5444994796-28nrn\" (UID: \"e320dc35-e65d-489f-b752-da6f9eda884f\") " pod="openshift-ingress/router-default-5444994796-28nrn" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.997513 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.999004 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkk7z"] Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.999077 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e320dc35-e65d-489f-b752-da6f9eda884f-metrics-certs\") pod 
\"router-default-5444994796-28nrn\" (UID: \"e320dc35-e65d-489f-b752-da6f9eda884f\") " pod="openshift-ingress/router-default-5444994796-28nrn" Sep 29 09:50:10 crc kubenswrapper[4891]: I0929 09:50:10.999588 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4fe5a5b3-033b-4d7e-8829-65de16f908a2-console-oauth-config\") pod \"console-f9d7485db-4nznm\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.001918 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/482b69f0-36a6-4320-8ea5-9e1263400532-client-ca\") pod \"controller-manager-879f6c89f-7tk8t\" (UID: \"482b69f0-36a6-4320-8ea5-9e1263400532\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.002011 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.002058 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/482b69f0-36a6-4320-8ea5-9e1263400532-config\") pod \"controller-manager-879f6c89f-7tk8t\" (UID: \"482b69f0-36a6-4320-8ea5-9e1263400532\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.002707 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb-serving-cert\") pod \"authentication-operator-69f744f599-pkk4x\" (UID: \"ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pkk4x" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.003971 4891 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pkk4x"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.005218 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/482b69f0-36a6-4320-8ea5-9e1263400532-serving-cert\") pod \"controller-manager-879f6c89f-7tk8t\" (UID: \"482b69f0-36a6-4320-8ea5-9e1263400532\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.007066 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e26e0265-569e-4929-8dd8-8b2665b37f81-encryption-config\") pod \"apiserver-7bbb656c7d-lzjgr\" (UID: \"e26e0265-569e-4929-8dd8-8b2665b37f81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.007971 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e320dc35-e65d-489f-b752-da6f9eda884f-stats-auth\") pod \"router-default-5444994796-28nrn\" (UID: \"e320dc35-e65d-489f-b752-da6f9eda884f\") " pod="openshift-ingress/router-default-5444994796-28nrn" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.008299 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4fe5a5b3-033b-4d7e-8829-65de16f908a2-console-serving-cert\") pod \"console-f9d7485db-4nznm\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.008359 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-n6x5w"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.009152 4891 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4fe5a5b3-033b-4d7e-8829-65de16f908a2-oauth-serving-cert\") pod \"console-f9d7485db-4nznm\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.013140 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bspfg"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.015875 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mgnzm"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.017479 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5t75"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.026646 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-4nznm"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.026926 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfbzg"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.026948 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.030780 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g5tqf"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.034850 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4nmlj"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.038271 4891 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.038611 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.038752 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-h4vv2"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.043442 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9c55k"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.045427 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jnfml"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.047258 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29318985-cvhgh"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.048326 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cd8dx"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.077884 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-khp2g"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.077986 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jf4xk"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.078013 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g6rvx"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.078644 4891 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"samples-operator-tls" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.079060 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.079219 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lkthp"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.080259 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96fa581a-e2fa-4c16-a646-4ac94eec1ef0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-tf4wv\" (UID: \"96fa581a-e2fa-4c16-a646-4ac94eec1ef0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tf4wv" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.080336 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9sns\" (UniqueName: \"kubernetes.io/projected/96fa581a-e2fa-4c16-a646-4ac94eec1ef0-kube-api-access-g9sns\") pod \"openshift-controller-manager-operator-756b6f6bc6-tf4wv\" (UID: \"96fa581a-e2fa-4c16-a646-4ac94eec1ef0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tf4wv" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.080406 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96fa581a-e2fa-4c16-a646-4ac94eec1ef0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tf4wv\" (UID: \"96fa581a-e2fa-4c16-a646-4ac94eec1ef0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tf4wv" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.094129 4891 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-dns/dns-default-mzzff"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.099007 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.101398 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mzzff" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.101465 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6lfmx"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.101550 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4k9wr"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.102932 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1792c60-bbca-441c-9c02-662c476c2d74-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dm6pg\" (UID: \"f1792c60-bbca-441c-9c02-662c476c2d74\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dm6pg" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.103488 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96fa581a-e2fa-4c16-a646-4ac94eec1ef0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-tf4wv\" (UID: \"96fa581a-e2fa-4c16-a646-4ac94eec1ef0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tf4wv" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.103527 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96fa581a-e2fa-4c16-a646-4ac94eec1ef0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tf4wv\" (UID: \"96fa581a-e2fa-4c16-a646-4ac94eec1ef0\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tf4wv" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.104275 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-jvjwz"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.106667 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-jvjwz" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.108102 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mzzff"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.112994 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm7n"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.114259 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-96vcr"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.115352 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9t6nh"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.116575 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.117226 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9nk5w"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.120040 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9nwpf"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.121533 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9nwpf"] Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 
09:50:11.121672 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.136728 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.157142 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.177093 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.198018 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.216912 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.243253 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.256827 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.277220 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.298093 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.317819 4891 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"serving-cert" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.336175 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.356072 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.376668 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.398270 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.417478 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.438566 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.458259 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.477380 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.498656 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.525762 4891 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"trusted-ca" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.537541 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.556760 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.577200 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.598611 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.618323 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.637321 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.656951 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.676938 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.697078 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.717094 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.737640 4891 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.758538 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.795567 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.802593 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.817422 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.837164 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.857419 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.877058 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.904188 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.917162 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.936676 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Sep 29 09:50:11 crc 
kubenswrapper[4891]: I0929 09:50:11.955378 4891 request.go:700] Waited for 1.00069447s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-controller-dockercfg-c2lfx&limit=500&resourceVersion=0 Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.958404 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.976834 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Sep 29 09:50:11 crc kubenswrapper[4891]: I0929 09:50:11.997665 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.017096 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.037488 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.057054 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.077032 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.097026 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.117084 4891 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.136997 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.158156 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.177580 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.198235 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.217191 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.295074 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.295117 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.297390 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.317307 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.336755 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Sep 29 09:50:12 crc 
kubenswrapper[4891]: I0929 09:50:12.357828 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.377215 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.409817 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.417491 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.437313 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.483928 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8r8q\" (UniqueName: \"kubernetes.io/projected/76751bcd-3e42-47b3-bfa8-a89525f681f6-kube-api-access-n8r8q\") pod \"apiserver-76f77b778f-fs2sv\" (UID: \"76751bcd-3e42-47b3-bfa8-a89525f681f6\") " pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.495349 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl2mb\" (UniqueName: \"kubernetes.io/projected/4fe5a5b3-033b-4d7e-8829-65de16f908a2-kube-api-access-rl2mb\") pod \"console-f9d7485db-4nznm\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.514307 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k6qx\" (UniqueName: \"kubernetes.io/projected/7c8c8f0a-f35e-497d-b953-8b8353d2780e-kube-api-access-2k6qx\") pod \"machine-approver-56656f9798-pvm8n\" 
(UID: \"7c8c8f0a-f35e-497d-b953-8b8353d2780e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pvm8n" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.533501 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgbvc\" (UniqueName: \"kubernetes.io/projected/ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb-kube-api-access-kgbvc\") pod \"authentication-operator-69f744f599-pkk4x\" (UID: \"ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pkk4x" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.557723 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gldb\" (UniqueName: \"kubernetes.io/projected/e320dc35-e65d-489f-b752-da6f9eda884f-kube-api-access-7gldb\") pod \"router-default-5444994796-28nrn\" (UID: \"e320dc35-e65d-489f-b752-da6f9eda884f\") " pod="openshift-ingress/router-default-5444994796-28nrn" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.573600 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fg6n\" (UniqueName: \"kubernetes.io/projected/fa68a099-1736-4f9a-bcaf-9840257afaeb-kube-api-access-9fg6n\") pod \"machine-api-operator-5694c8668f-4hjlz\" (UID: \"fa68a099-1736-4f9a-bcaf-9840257afaeb\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4hjlz" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.596321 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9j6x\" (UniqueName: \"kubernetes.io/projected/3ce4282f-8661-4ec8-894e-ec921b4c6f4c-kube-api-access-x9j6x\") pod \"kube-storage-version-migrator-operator-b67b599dd-mhz5r\" (UID: \"3ce4282f-8661-4ec8-894e-ec921b4c6f4c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mhz5r" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.597420 4891 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.626025 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-pkk4x" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.633870 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf772\" (UniqueName: \"kubernetes.io/projected/0971f343-a162-4db1-96bb-3857bd667ad2-kube-api-access-nf772\") pod \"openshift-config-operator-7777fb866f-cc2bw\" (UID: \"0971f343-a162-4db1-96bb-3857bd667ad2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cc2bw" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.638900 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.658800 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sthck\" (UniqueName: \"kubernetes.io/projected/303b949b-a531-46fa-a69d-6cc909009fc4-kube-api-access-sthck\") pod \"downloads-7954f5f757-l9jdp\" (UID: \"303b949b-a531-46fa-a69d-6cc909009fc4\") " pod="openshift-console/downloads-7954f5f757-l9jdp" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.674036 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dlg9\" (UniqueName: \"kubernetes.io/projected/e26e0265-569e-4929-8dd8-8b2665b37f81-kube-api-access-9dlg9\") pod \"apiserver-7bbb656c7d-lzjgr\" (UID: \"e26e0265-569e-4929-8dd8-8b2665b37f81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.685317 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pvm8n" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.695920 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfhzr\" (UniqueName: \"kubernetes.io/projected/f1792c60-bbca-441c-9c02-662c476c2d74-kube-api-access-rfhzr\") pod \"openshift-apiserver-operator-796bbdcf4f-dm6pg\" (UID: \"f1792c60-bbca-441c-9c02-662c476c2d74\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dm6pg" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.696699 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Sep 29 09:50:12 crc kubenswrapper[4891]: W0929 09:50:12.709058 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c8c8f0a_f35e_497d_b953_8b8353d2780e.slice/crio-d6c7b3c4fc3848a68b3165fa5c71a1391c729dc8607e12e4140ac97d37d771d1 WatchSource:0}: Error finding container d6c7b3c4fc3848a68b3165fa5c71a1391c729dc8607e12e4140ac97d37d771d1: Status 404 returned error can't find the container with id d6c7b3c4fc3848a68b3165fa5c71a1391c729dc8607e12e4140ac97d37d771d1 Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.717406 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.743488 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.745523 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-4hjlz" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.757653 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.758236 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-l9jdp" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.772804 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mhz5r" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.782144 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.813555 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkc69\" (UniqueName: \"kubernetes.io/projected/482b69f0-36a6-4320-8ea5-9e1263400532-kube-api-access-mkc69\") pod \"controller-manager-879f6c89f-7tk8t\" (UID: \"482b69f0-36a6-4320-8ea5-9e1263400532\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.816183 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzmpp\" (UniqueName: \"kubernetes.io/projected/570d72c8-d4ed-4b0a-876a-5a942b32a958-kube-api-access-qzmpp\") pod \"control-plane-machine-set-operator-78cbb6b69f-bspfg\" (UID: \"570d72c8-d4ed-4b0a-876a-5a942b32a958\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bspfg" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.818663 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Sep 29 09:50:12 crc kubenswrapper[4891]: 
I0929 09:50:12.838468 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.839810 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-28nrn" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.852269 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pkk4x"] Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.858852 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.869237 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:12 crc kubenswrapper[4891]: W0929 09:50:12.869725 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode320dc35_e65d_489f_b752_da6f9eda884f.slice/crio-6233979e1aa6974ceffe7555f407ab8e405a98d54e47ec87eeb5a7b3590a1ffc WatchSource:0}: Error finding container 6233979e1aa6974ceffe7555f407ab8e405a98d54e47ec87eeb5a7b3590a1ffc: Status 404 returned error can't find the container with id 6233979e1aa6974ceffe7555f407ab8e405a98d54e47ec87eeb5a7b3590a1ffc Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.882842 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Sep 29 09:50:12 crc kubenswrapper[4891]: W0929 09:50:12.890514 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca6a0da2_cbc9_45bc_a213_3a2e87fbdbeb.slice/crio-05de4bd8d0863b7a5357b9f6e02ec46c64872a3354f8c3563a3ef2967a6ef1da 
WatchSource:0}: Error finding container 05de4bd8d0863b7a5357b9f6e02ec46c64872a3354f8c3563a3ef2967a6ef1da: Status 404 returned error can't find the container with id 05de4bd8d0863b7a5357b9f6e02ec46c64872a3354f8c3563a3ef2967a6ef1da Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.896611 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.902818 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fs2sv"] Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.906000 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cc2bw" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.913903 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bspfg" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.921676 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.925458 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.938019 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.974841 4891 request.go:700] Waited for 1.893295982s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/serviceaccounts/openshift-controller-manager-operator/token Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.974951 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dm6pg" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.997244 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Sep 29 09:50:12 crc kubenswrapper[4891]: I0929 09:50:12.999013 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9sns\" (UniqueName: \"kubernetes.io/projected/96fa581a-e2fa-4c16-a646-4ac94eec1ef0-kube-api-access-g9sns\") pod \"openshift-controller-manager-operator-756b6f6bc6-tf4wv\" (UID: \"96fa581a-e2fa-4c16-a646-4ac94eec1ef0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tf4wv" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.014070 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4hjlz"] Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.017290 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.037539 4891 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"dns-dockercfg-jwfmh" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.060380 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.084650 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.104418 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.112992 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-l9jdp"] Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.120527 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mhz5r"] Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.120921 4891 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.139188 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.158200 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.162647 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-4nznm"] Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.173701 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-pkk4x" 
event={"ID":"ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb","Type":"ContainerStarted","Data":"05de4bd8d0863b7a5357b9f6e02ec46c64872a3354f8c3563a3ef2967a6ef1da"} Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.184385 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-28nrn" event={"ID":"e320dc35-e65d-489f-b752-da6f9eda884f","Type":"ContainerStarted","Data":"6233979e1aa6974ceffe7555f407ab8e405a98d54e47ec87eeb5a7b3590a1ffc"} Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.186870 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-l9jdp" event={"ID":"303b949b-a531-46fa-a69d-6cc909009fc4","Type":"ContainerStarted","Data":"c47da1cf440ae9e0ec8faac624f09e1dee3b8c251270841ae50ff4519e9d402f"} Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209193 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2017d3b8-cade-41e1-91ce-962c506c4c31-webhook-cert\") pod \"packageserver-d55dfcdfc-4nmlj\" (UID: \"2017d3b8-cade-41e1-91ce-962c506c4c31\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4nmlj" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209232 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d7389ac7-6063-4742-8f3a-bcf2b0721522-metrics-tls\") pod \"ingress-operator-5b745b69d9-4k9wr\" (UID: \"d7389ac7-6063-4742-8f3a-bcf2b0721522\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4k9wr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209257 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwt75\" (UniqueName: \"kubernetes.io/projected/36a6681d-099f-47d1-91a8-da5c2c510a21-kube-api-access-mwt75\") pod \"service-ca-9c57cc56f-llvr5\" (UID: 
\"36a6681d-099f-47d1-91a8-da5c2c510a21\") " pod="openshift-service-ca/service-ca-9c57cc56f-llvr5" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209286 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2017d3b8-cade-41e1-91ce-962c506c4c31-tmpfs\") pod \"packageserver-d55dfcdfc-4nmlj\" (UID: \"2017d3b8-cade-41e1-91ce-962c506c4c31\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4nmlj" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209307 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209325 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz6mt\" (UniqueName: \"kubernetes.io/projected/a36bae20-4dc0-4377-ad84-9cb27c982c88-kube-api-access-fz6mt\") pod \"machine-config-controller-84d6567774-n6x5w\" (UID: \"a36bae20-4dc0-4377-ad84-9cb27c982c88\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n6x5w" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209353 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-bound-sa-token\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209368 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7389ac7-6063-4742-8f3a-bcf2b0721522-trusted-ca\") pod \"ingress-operator-5b745b69d9-4k9wr\" (UID: \"d7389ac7-6063-4742-8f3a-bcf2b0721522\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4k9wr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209397 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b5dbaded-b4ab-4c6d-b825-885f87113f5b-metrics-tls\") pod \"dns-operator-744455d44c-h4vv2\" (UID: \"b5dbaded-b4ab-4c6d-b825-885f87113f5b\") " pod="openshift-dns-operator/dns-operator-744455d44c-h4vv2" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209413 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c622ed91-dbc3-4b63-bb42-a9fed92bf9ef-config\") pod \"etcd-operator-b45778765-lkthp\" (UID: \"c622ed91-dbc3-4b63-bb42-a9fed92bf9ef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lkthp" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209428 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c622ed91-dbc3-4b63-bb42-a9fed92bf9ef-etcd-client\") pod \"etcd-operator-b45778765-lkthp\" (UID: \"c622ed91-dbc3-4b63-bb42-a9fed92bf9ef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lkthp" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209445 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209470 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ca8f6f-22af-4326-88a8-e29b3dc62384-config\") pod \"kube-apiserver-operator-766d6c64bb-s5t75\" (UID: \"c2ca8f6f-22af-4326-88a8-e29b3dc62384\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5t75" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209491 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d0c31e3-682d-4758-927e-1e293d39630c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9c55k\" (UID: \"2d0c31e3-682d-4758-927e-1e293d39630c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9c55k" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209514 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4nxs\" (UniqueName: \"kubernetes.io/projected/21a1ea5d-e0be-47d8-8ed4-f2e9b040771d-kube-api-access-k4nxs\") pod \"cluster-samples-operator-665b6dd947-g6rvx\" (UID: \"21a1ea5d-e0be-47d8-8ed4-f2e9b040771d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g6rvx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209530 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/36a6681d-099f-47d1-91a8-da5c2c510a21-signing-key\") pod \"service-ca-9c57cc56f-llvr5\" (UID: \"36a6681d-099f-47d1-91a8-da5c2c510a21\") " pod="openshift-service-ca/service-ca-9c57cc56f-llvr5" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209548 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2dadc724-8c40-4857-ab4a-bd4dbb58963b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-vkk7z\" (UID: \"2dadc724-8c40-4857-ab4a-bd4dbb58963b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkk7z" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209565 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdsdl\" (UniqueName: \"kubernetes.io/projected/9231f127-f250-40fa-b818-7b01584701d2-kube-api-access-rdsdl\") pod \"machine-config-operator-74547568cd-jf4xk\" (UID: \"9231f127-f250-40fa-b818-7b01584701d2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jf4xk" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209609 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2dadc724-8c40-4857-ab4a-bd4dbb58963b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-vkk7z\" (UID: \"2dadc724-8c40-4857-ab4a-bd4dbb58963b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkk7z" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209627 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209651 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d102c7ee-0242-4b77-85f4-5ca86e742bf2-serving-cert\") pod \"route-controller-manager-6576b87f9c-6bzpg\" (UID: \"d102c7ee-0242-4b77-85f4-5ca86e742bf2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209669 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp8pj\" (UniqueName: \"kubernetes.io/projected/b5dbaded-b4ab-4c6d-b825-885f87113f5b-kube-api-access-bp8pj\") pod \"dns-operator-744455d44c-h4vv2\" (UID: \"b5dbaded-b4ab-4c6d-b825-885f87113f5b\") " pod="openshift-dns-operator/dns-operator-744455d44c-h4vv2" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209693 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94twf\" (UniqueName: \"kubernetes.io/projected/017fcc94-98f6-4abd-9954-c3212676f6e7-kube-api-access-94twf\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209716 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxnw4\" (UniqueName: \"kubernetes.io/projected/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-kube-api-access-wxnw4\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209737 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sv4w\" (UniqueName: \"kubernetes.io/projected/c622ed91-dbc3-4b63-bb42-a9fed92bf9ef-kube-api-access-8sv4w\") pod \"etcd-operator-b45778765-lkthp\" (UID: \"c622ed91-dbc3-4b63-bb42-a9fed92bf9ef\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-lkthp" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209780 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a36bae20-4dc0-4377-ad84-9cb27c982c88-proxy-tls\") pod \"machine-config-controller-84d6567774-n6x5w\" (UID: \"a36bae20-4dc0-4377-ad84-9cb27c982c88\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n6x5w" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209810 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209836 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c622ed91-dbc3-4b63-bb42-a9fed92bf9ef-etcd-ca\") pod \"etcd-operator-b45778765-lkthp\" (UID: \"c622ed91-dbc3-4b63-bb42-a9fed92bf9ef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lkthp" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209860 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/21a1ea5d-e0be-47d8-8ed4-f2e9b040771d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-g6rvx\" (UID: \"21a1ea5d-e0be-47d8-8ed4-f2e9b040771d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g6rvx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209885 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209941 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-trusted-ca\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209961 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d102c7ee-0242-4b77-85f4-5ca86e742bf2-config\") pod \"route-controller-manager-6576b87f9c-6bzpg\" (UID: \"d102c7ee-0242-4b77-85f4-5ca86e742bf2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.209976 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210011 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l25s\" (UniqueName: \"kubernetes.io/projected/2017d3b8-cade-41e1-91ce-962c506c4c31-kube-api-access-7l25s\") pod \"packageserver-d55dfcdfc-4nmlj\" (UID: \"2017d3b8-cade-41e1-91ce-962c506c4c31\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4nmlj" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210026 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d0c31e3-682d-4758-927e-1e293d39630c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9c55k\" (UID: \"2d0c31e3-682d-4758-927e-1e293d39630c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9c55k" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210072 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-registry-tls\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210087 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c622ed91-dbc3-4b63-bb42-a9fed92bf9ef-etcd-service-ca\") pod \"etcd-operator-b45778765-lkthp\" (UID: \"c622ed91-dbc3-4b63-bb42-a9fed92bf9ef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lkthp" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210103 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c2ca8f6f-22af-4326-88a8-e29b3dc62384-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-s5t75\" (UID: \"c2ca8f6f-22af-4326-88a8-e29b3dc62384\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5t75" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210119 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210144 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a36bae20-4dc0-4377-ad84-9cb27c982c88-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-n6x5w\" (UID: \"a36bae20-4dc0-4377-ad84-9cb27c982c88\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n6x5w" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210187 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1afb1091-dfff-4abe-a642-83e586b66b4e-trusted-ca\") pod \"console-operator-58897d9998-jnfml\" (UID: \"1afb1091-dfff-4abe-a642-83e586b66b4e\") " pod="openshift-console-operator/console-operator-58897d9998-jnfml" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210206 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrnwh\" (UniqueName: \"kubernetes.io/projected/d102c7ee-0242-4b77-85f4-5ca86e742bf2-kube-api-access-mrnwh\") pod \"route-controller-manager-6576b87f9c-6bzpg\" (UID: \"d102c7ee-0242-4b77-85f4-5ca86e742bf2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210224 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2017d3b8-cade-41e1-91ce-962c506c4c31-apiservice-cert\") pod 
\"packageserver-d55dfcdfc-4nmlj\" (UID: \"2017d3b8-cade-41e1-91ce-962c506c4c31\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4nmlj" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210242 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srp57\" (UniqueName: \"kubernetes.io/projected/588d9386-5914-4ca4-bea2-fcb06e0de4b3-kube-api-access-srp57\") pod \"migrator-59844c95c7-54mnr\" (UID: \"588d9386-5914-4ca4-bea2-fcb06e0de4b3\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-54mnr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210257 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e702ccf-0d4b-4ad0-bb27-16bc39303992-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-g5tqf\" (UID: \"0e702ccf-0d4b-4ad0-bb27-16bc39303992\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g5tqf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210272 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/017fcc94-98f6-4abd-9954-c3212676f6e7-audit-policies\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210315 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 
09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210347 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw4f4\" (UniqueName: \"kubernetes.io/projected/d7389ac7-6063-4742-8f3a-bcf2b0721522-kube-api-access-nw4f4\") pod \"ingress-operator-5b745b69d9-4k9wr\" (UID: \"d7389ac7-6063-4742-8f3a-bcf2b0721522\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4k9wr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210367 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210384 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9231f127-f250-40fa-b818-7b01584701d2-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jf4xk\" (UID: \"9231f127-f250-40fa-b818-7b01584701d2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jf4xk" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210426 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e702ccf-0d4b-4ad0-bb27-16bc39303992-config\") pod \"kube-controller-manager-operator-78b949d7b-g5tqf\" (UID: \"0e702ccf-0d4b-4ad0-bb27-16bc39303992\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g5tqf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210443 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/1afb1091-dfff-4abe-a642-83e586b66b4e-config\") pod \"console-operator-58897d9998-jnfml\" (UID: \"1afb1091-dfff-4abe-a642-83e586b66b4e\") " pod="openshift-console-operator/console-operator-58897d9998-jnfml" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210459 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e702ccf-0d4b-4ad0-bb27-16bc39303992-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-g5tqf\" (UID: \"0e702ccf-0d4b-4ad0-bb27-16bc39303992\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g5tqf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210475 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9231f127-f250-40fa-b818-7b01584701d2-proxy-tls\") pod \"machine-config-operator-74547568cd-jf4xk\" (UID: \"9231f127-f250-40fa-b818-7b01584701d2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jf4xk" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210490 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9231f127-f250-40fa-b818-7b01584701d2-images\") pod \"machine-config-operator-74547568cd-jf4xk\" (UID: \"9231f127-f250-40fa-b818-7b01584701d2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jf4xk" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210506 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210520 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d0c31e3-682d-4758-927e-1e293d39630c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9c55k\" (UID: \"2d0c31e3-682d-4758-927e-1e293d39630c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9c55k" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210536 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/017fcc94-98f6-4abd-9954-c3212676f6e7-audit-dir\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210550 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210566 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2ca8f6f-22af-4326-88a8-e29b3dc62384-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-s5t75\" (UID: \"c2ca8f6f-22af-4326-88a8-e29b3dc62384\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5t75" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210581 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7389ac7-6063-4742-8f3a-bcf2b0721522-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4k9wr\" (UID: \"d7389ac7-6063-4742-8f3a-bcf2b0721522\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4k9wr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210666 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-registry-certificates\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210682 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4vbn\" (UniqueName: \"kubernetes.io/projected/1afb1091-dfff-4abe-a642-83e586b66b4e-kube-api-access-h4vbn\") pod \"console-operator-58897d9998-jnfml\" (UID: \"1afb1091-dfff-4abe-a642-83e586b66b4e\") " pod="openshift-console-operator/console-operator-58897d9998-jnfml" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210698 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210716 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210733 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/36a6681d-099f-47d1-91a8-da5c2c510a21-signing-cabundle\") pod \"service-ca-9c57cc56f-llvr5\" (UID: \"36a6681d-099f-47d1-91a8-da5c2c510a21\") " pod="openshift-service-ca/service-ca-9c57cc56f-llvr5" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210829 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1afb1091-dfff-4abe-a642-83e586b66b4e-serving-cert\") pod \"console-operator-58897d9998-jnfml\" (UID: \"1afb1091-dfff-4abe-a642-83e586b66b4e\") " pod="openshift-console-operator/console-operator-58897d9998-jnfml" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210853 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsnrd\" (UniqueName: \"kubernetes.io/projected/2dadc724-8c40-4857-ab4a-bd4dbb58963b-kube-api-access-zsnrd\") pod \"cluster-image-registry-operator-dc59b4c8b-vkk7z\" (UID: \"2dadc724-8c40-4857-ab4a-bd4dbb58963b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkk7z" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210869 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c622ed91-dbc3-4b63-bb42-a9fed92bf9ef-serving-cert\") pod \"etcd-operator-b45778765-lkthp\" (UID: \"c622ed91-dbc3-4b63-bb42-a9fed92bf9ef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lkthp" Sep 
29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210884 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2dadc724-8c40-4857-ab4a-bd4dbb58963b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-vkk7z\" (UID: \"2dadc724-8c40-4857-ab4a-bd4dbb58963b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkk7z" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210900 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.210925 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d102c7ee-0242-4b77-85f4-5ca86e742bf2-client-ca\") pod \"route-controller-manager-6576b87f9c-6bzpg\" (UID: \"d102c7ee-0242-4b77-85f4-5ca86e742bf2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.217545 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4hjlz" event={"ID":"fa68a099-1736-4f9a-bcaf-9840257afaeb","Type":"ContainerStarted","Data":"0c11ee155a1cd1bbec0241844cdf281e0b0a2d8a5edca2143f84e84289f3d856"} Sep 29 09:50:13 crc kubenswrapper[4891]: E0929 09:50:13.219875 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-29 09:50:13.719856829 +0000 UTC m=+143.925025150 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.227211 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr"] Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.228281 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pvm8n" event={"ID":"7c8c8f0a-f35e-497d-b953-8b8353d2780e","Type":"ContainerStarted","Data":"d6c7b3c4fc3848a68b3165fa5c71a1391c729dc8607e12e4140ac97d37d771d1"} Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.233201 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" event={"ID":"76751bcd-3e42-47b3-bfa8-a89525f681f6","Type":"ContainerStarted","Data":"c90f6913f5aa30522dc0dd04414ebb9272a3376e630522ded78a2ee60b35d456"} Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.251540 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tf4wv" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.313718 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.320828 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1afb1091-dfff-4abe-a642-83e586b66b4e-serving-cert\") pod \"console-operator-58897d9998-jnfml\" (UID: \"1afb1091-dfff-4abe-a642-83e586b66b4e\") " pod="openshift-console-operator/console-operator-58897d9998-jnfml" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.320963 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsnrd\" (UniqueName: \"kubernetes.io/projected/2dadc724-8c40-4857-ab4a-bd4dbb58963b-kube-api-access-zsnrd\") pod \"cluster-image-registry-operator-dc59b4c8b-vkk7z\" (UID: \"2dadc724-8c40-4857-ab4a-bd4dbb58963b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkk7z" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.320998 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c622ed91-dbc3-4b63-bb42-a9fed92bf9ef-serving-cert\") pod \"etcd-operator-b45778765-lkthp\" (UID: \"c622ed91-dbc3-4b63-bb42-a9fed92bf9ef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lkthp" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321021 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/2dadc724-8c40-4857-ab4a-bd4dbb58963b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-vkk7z\" (UID: \"2dadc724-8c40-4857-ab4a-bd4dbb58963b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkk7z" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321048 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/492762a9-9e7d-4095-b2f4-990f58b82d21-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-96vcr\" (UID: \"492762a9-9e7d-4095-b2f4-990f58b82d21\") " pod="openshift-marketplace/marketplace-operator-79b997595-96vcr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321073 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d102c7ee-0242-4b77-85f4-5ca86e742bf2-client-ca\") pod \"route-controller-manager-6576b87f9c-6bzpg\" (UID: \"d102c7ee-0242-4b77-85f4-5ca86e742bf2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321093 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321141 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjrm4\" (UniqueName: \"kubernetes.io/projected/44c6fbd1-b182-4c7a-867a-6f310ed97fdb-kube-api-access-cjrm4\") pod \"csi-hostpathplugin-9nwpf\" (UID: \"44c6fbd1-b182-4c7a-867a-6f310ed97fdb\") " 
pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321161 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2017d3b8-cade-41e1-91ce-962c506c4c31-webhook-cert\") pod \"packageserver-d55dfcdfc-4nmlj\" (UID: \"2017d3b8-cade-41e1-91ce-962c506c4c31\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4nmlj" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321185 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d7389ac7-6063-4742-8f3a-bcf2b0721522-metrics-tls\") pod \"ingress-operator-5b745b69d9-4k9wr\" (UID: \"d7389ac7-6063-4742-8f3a-bcf2b0721522\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4k9wr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321215 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwt75\" (UniqueName: \"kubernetes.io/projected/36a6681d-099f-47d1-91a8-da5c2c510a21-kube-api-access-mwt75\") pod \"service-ca-9c57cc56f-llvr5\" (UID: \"36a6681d-099f-47d1-91a8-da5c2c510a21\") " pod="openshift-service-ca/service-ca-9c57cc56f-llvr5" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321246 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2017d3b8-cade-41e1-91ce-962c506c4c31-tmpfs\") pod \"packageserver-d55dfcdfc-4nmlj\" (UID: \"2017d3b8-cade-41e1-91ce-962c506c4c31\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4nmlj" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321266 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/44c6fbd1-b182-4c7a-867a-6f310ed97fdb-csi-data-dir\") pod \"csi-hostpathplugin-9nwpf\" (UID: 
\"44c6fbd1-b182-4c7a-867a-6f310ed97fdb\") " pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321291 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321314 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz6mt\" (UniqueName: \"kubernetes.io/projected/a36bae20-4dc0-4377-ad84-9cb27c982c88-kube-api-access-fz6mt\") pod \"machine-config-controller-84d6567774-n6x5w\" (UID: \"a36bae20-4dc0-4377-ad84-9cb27c982c88\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n6x5w" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321339 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lblz4\" (UniqueName: \"kubernetes.io/projected/5fa727e5-e5ff-49bc-925c-eb272a95a228-kube-api-access-lblz4\") pod \"service-ca-operator-777779d784-khp2g\" (UID: \"5fa727e5-e5ff-49bc-925c-eb272a95a228\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-khp2g" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321367 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-bound-sa-token\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321389 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b5dbaded-b4ab-4c6d-b825-885f87113f5b-metrics-tls\") pod \"dns-operator-744455d44c-h4vv2\" (UID: \"b5dbaded-b4ab-4c6d-b825-885f87113f5b\") " pod="openshift-dns-operator/dns-operator-744455d44c-h4vv2" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321412 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7389ac7-6063-4742-8f3a-bcf2b0721522-trusted-ca\") pod \"ingress-operator-5b745b69d9-4k9wr\" (UID: \"d7389ac7-6063-4742-8f3a-bcf2b0721522\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4k9wr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321433 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/492762a9-9e7d-4095-b2f4-990f58b82d21-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-96vcr\" (UID: \"492762a9-9e7d-4095-b2f4-990f58b82d21\") " pod="openshift-marketplace/marketplace-operator-79b997595-96vcr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321475 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321499 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c622ed91-dbc3-4b63-bb42-a9fed92bf9ef-config\") pod \"etcd-operator-b45778765-lkthp\" (UID: \"c622ed91-dbc3-4b63-bb42-a9fed92bf9ef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lkthp" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 
09:50:13.321520 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c622ed91-dbc3-4b63-bb42-a9fed92bf9ef-etcd-client\") pod \"etcd-operator-b45778765-lkthp\" (UID: \"c622ed91-dbc3-4b63-bb42-a9fed92bf9ef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lkthp" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321553 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ca8f6f-22af-4326-88a8-e29b3dc62384-config\") pod \"kube-apiserver-operator-766d6c64bb-s5t75\" (UID: \"c2ca8f6f-22af-4326-88a8-e29b3dc62384\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5t75" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321590 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d0c31e3-682d-4758-927e-1e293d39630c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9c55k\" (UID: \"2d0c31e3-682d-4758-927e-1e293d39630c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9c55k" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321619 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4nxs\" (UniqueName: \"kubernetes.io/projected/21a1ea5d-e0be-47d8-8ed4-f2e9b040771d-kube-api-access-k4nxs\") pod \"cluster-samples-operator-665b6dd947-g6rvx\" (UID: \"21a1ea5d-e0be-47d8-8ed4-f2e9b040771d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g6rvx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321649 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d8e4b574-6c94-42fe-9e04-651a2f252e8e-srv-cert\") pod \"catalog-operator-68c6474976-9nk5w\" (UID: 
\"d8e4b574-6c94-42fe-9e04-651a2f252e8e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9nk5w" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321679 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/36a6681d-099f-47d1-91a8-da5c2c510a21-signing-key\") pod \"service-ca-9c57cc56f-llvr5\" (UID: \"36a6681d-099f-47d1-91a8-da5c2c510a21\") " pod="openshift-service-ca/service-ca-9c57cc56f-llvr5" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321706 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d8e4b574-6c94-42fe-9e04-651a2f252e8e-profile-collector-cert\") pod \"catalog-operator-68c6474976-9nk5w\" (UID: \"d8e4b574-6c94-42fe-9e04-651a2f252e8e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9nk5w" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321746 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2dadc724-8c40-4857-ab4a-bd4dbb58963b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-vkk7z\" (UID: \"2dadc724-8c40-4857-ab4a-bd4dbb58963b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkk7z" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321770 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdsdl\" (UniqueName: \"kubernetes.io/projected/9231f127-f250-40fa-b818-7b01584701d2-kube-api-access-rdsdl\") pod \"machine-config-operator-74547568cd-jf4xk\" (UID: \"9231f127-f250-40fa-b818-7b01584701d2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jf4xk" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.321838 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/00c254c1-9578-492b-8e89-2272cd24b565-certs\") pod \"machine-config-server-jvjwz\" (UID: \"00c254c1-9578-492b-8e89-2272cd24b565\") " pod="openshift-machine-config-operator/machine-config-server-jvjwz" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.323441 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/863b27cf-013e-4f40-8b0f-3323ee3aa8ad-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8pm7n\" (UID: \"863b27cf-013e-4f40-8b0f-3323ee3aa8ad\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm7n" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.323485 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2dadc724-8c40-4857-ab4a-bd4dbb58963b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-vkk7z\" (UID: \"2dadc724-8c40-4857-ab4a-bd4dbb58963b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkk7z" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.323517 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.323547 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d102c7ee-0242-4b77-85f4-5ca86e742bf2-serving-cert\") pod \"route-controller-manager-6576b87f9c-6bzpg\" (UID: 
\"d102c7ee-0242-4b77-85f4-5ca86e742bf2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.323576 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp8pj\" (UniqueName: \"kubernetes.io/projected/b5dbaded-b4ab-4c6d-b825-885f87113f5b-kube-api-access-bp8pj\") pod \"dns-operator-744455d44c-h4vv2\" (UID: \"b5dbaded-b4ab-4c6d-b825-885f87113f5b\") " pod="openshift-dns-operator/dns-operator-744455d44c-h4vv2" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.323614 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94twf\" (UniqueName: \"kubernetes.io/projected/017fcc94-98f6-4abd-9954-c3212676f6e7-kube-api-access-94twf\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.323648 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxnw4\" (UniqueName: \"kubernetes.io/projected/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-kube-api-access-wxnw4\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.323674 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sv4w\" (UniqueName: \"kubernetes.io/projected/c622ed91-dbc3-4b63-bb42-a9fed92bf9ef-kube-api-access-8sv4w\") pod \"etcd-operator-b45778765-lkthp\" (UID: \"c622ed91-dbc3-4b63-bb42-a9fed92bf9ef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lkthp" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.323734 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/a36bae20-4dc0-4377-ad84-9cb27c982c88-proxy-tls\") pod \"machine-config-controller-84d6567774-n6x5w\" (UID: \"a36bae20-4dc0-4377-ad84-9cb27c982c88\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n6x5w" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.323767 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.323807 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c622ed91-dbc3-4b63-bb42-a9fed92bf9ef-etcd-ca\") pod \"etcd-operator-b45778765-lkthp\" (UID: \"c622ed91-dbc3-4b63-bb42-a9fed92bf9ef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lkthp" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.323834 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/21a1ea5d-e0be-47d8-8ed4-f2e9b040771d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-g6rvx\" (UID: \"21a1ea5d-e0be-47d8-8ed4-f2e9b040771d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g6rvx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.323868 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s8ms\" (UniqueName: \"kubernetes.io/projected/492762a9-9e7d-4095-b2f4-990f58b82d21-kube-api-access-5s8ms\") pod \"marketplace-operator-79b997595-96vcr\" (UID: \"492762a9-9e7d-4095-b2f4-990f58b82d21\") " pod="openshift-marketplace/marketplace-operator-79b997595-96vcr" Sep 29 
09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.325497 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71c66a9c-1955-4f9e-8b23-7bd7361b8550-config-volume\") pod \"dns-default-mzzff\" (UID: \"71c66a9c-1955-4f9e-8b23-7bd7361b8550\") " pod="openshift-dns/dns-default-mzzff" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.325593 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-trusted-ca\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.325637 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d102c7ee-0242-4b77-85f4-5ca86e742bf2-config\") pod \"route-controller-manager-6576b87f9c-6bzpg\" (UID: \"d102c7ee-0242-4b77-85f4-5ca86e742bf2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.325672 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.325706 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8d02cdd-0b15-456d-ad23-5ff1b81875a6-cert\") pod \"ingress-canary-6lfmx\" (UID: \"a8d02cdd-0b15-456d-ad23-5ff1b81875a6\") " 
pod="openshift-ingress-canary/ingress-canary-6lfmx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.325743 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l25s\" (UniqueName: \"kubernetes.io/projected/2017d3b8-cade-41e1-91ce-962c506c4c31-kube-api-access-7l25s\") pod \"packageserver-d55dfcdfc-4nmlj\" (UID: \"2017d3b8-cade-41e1-91ce-962c506c4c31\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4nmlj" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.325775 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d0c31e3-682d-4758-927e-1e293d39630c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9c55k\" (UID: \"2d0c31e3-682d-4758-927e-1e293d39630c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9c55k" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.325832 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-registry-tls\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.325856 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c622ed91-dbc3-4b63-bb42-a9fed92bf9ef-etcd-service-ca\") pod \"etcd-operator-b45778765-lkthp\" (UID: \"c622ed91-dbc3-4b63-bb42-a9fed92bf9ef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lkthp" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.325896 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/443c2a5c-8366-4170-80e7-063687c1caaf-config-volume\") pod \"collect-profiles-29318985-cvhgh\" (UID: \"443c2a5c-8366-4170-80e7-063687c1caaf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-cvhgh" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.325928 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c2ca8f6f-22af-4326-88a8-e29b3dc62384-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-s5t75\" (UID: \"c2ca8f6f-22af-4326-88a8-e29b3dc62384\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5t75" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.325958 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.325987 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbdpj\" (UniqueName: \"kubernetes.io/projected/443c2a5c-8366-4170-80e7-063687c1caaf-kube-api-access-gbdpj\") pod \"collect-profiles-29318985-cvhgh\" (UID: \"443c2a5c-8366-4170-80e7-063687c1caaf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-cvhgh" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326024 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a36bae20-4dc0-4377-ad84-9cb27c982c88-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-n6x5w\" (UID: \"a36bae20-4dc0-4377-ad84-9cb27c982c88\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n6x5w" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326051 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc9xc\" (UniqueName: \"kubernetes.io/projected/71c66a9c-1955-4f9e-8b23-7bd7361b8550-kube-api-access-fc9xc\") pod \"dns-default-mzzff\" (UID: \"71c66a9c-1955-4f9e-8b23-7bd7361b8550\") " pod="openshift-dns/dns-default-mzzff" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326102 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1afb1091-dfff-4abe-a642-83e586b66b4e-trusted-ca\") pod \"console-operator-58897d9998-jnfml\" (UID: \"1afb1091-dfff-4abe-a642-83e586b66b4e\") " pod="openshift-console-operator/console-operator-58897d9998-jnfml" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326132 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzz4k\" (UniqueName: \"kubernetes.io/projected/beebb3f2-0934-4828-afec-cacda471836b-kube-api-access-tzz4k\") pod \"olm-operator-6b444d44fb-pfbzg\" (UID: \"beebb3f2-0934-4828-afec-cacda471836b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfbzg" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326163 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e702ccf-0d4b-4ad0-bb27-16bc39303992-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-g5tqf\" (UID: \"0e702ccf-0d4b-4ad0-bb27-16bc39303992\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g5tqf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326193 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrnwh\" (UniqueName: 
\"kubernetes.io/projected/d102c7ee-0242-4b77-85f4-5ca86e742bf2-kube-api-access-mrnwh\") pod \"route-controller-manager-6576b87f9c-6bzpg\" (UID: \"d102c7ee-0242-4b77-85f4-5ca86e742bf2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326225 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2017d3b8-cade-41e1-91ce-962c506c4c31-apiservice-cert\") pod \"packageserver-d55dfcdfc-4nmlj\" (UID: \"2017d3b8-cade-41e1-91ce-962c506c4c31\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4nmlj" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326250 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srp57\" (UniqueName: \"kubernetes.io/projected/588d9386-5914-4ca4-bea2-fcb06e0de4b3-kube-api-access-srp57\") pod \"migrator-59844c95c7-54mnr\" (UID: \"588d9386-5914-4ca4-bea2-fcb06e0de4b3\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-54mnr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326282 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/017fcc94-98f6-4abd-9954-c3212676f6e7-audit-policies\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326314 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/00c254c1-9578-492b-8e89-2272cd24b565-node-bootstrap-token\") pod \"machine-config-server-jvjwz\" (UID: \"00c254c1-9578-492b-8e89-2272cd24b565\") " pod="openshift-machine-config-operator/machine-config-server-jvjwz" Sep 29 09:50:13 crc 
kubenswrapper[4891]: I0929 09:50:13.326355 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326387 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/443c2a5c-8366-4170-80e7-063687c1caaf-secret-volume\") pod \"collect-profiles-29318985-cvhgh\" (UID: \"443c2a5c-8366-4170-80e7-063687c1caaf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-cvhgh" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326417 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/44c6fbd1-b182-4c7a-867a-6f310ed97fdb-plugins-dir\") pod \"csi-hostpathplugin-9nwpf\" (UID: \"44c6fbd1-b182-4c7a-867a-6f310ed97fdb\") " pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326455 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9231f127-f250-40fa-b818-7b01584701d2-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jf4xk\" (UID: \"9231f127-f250-40fa-b818-7b01584701d2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jf4xk" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326484 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw4f4\" (UniqueName: 
\"kubernetes.io/projected/d7389ac7-6063-4742-8f3a-bcf2b0721522-kube-api-access-nw4f4\") pod \"ingress-operator-5b745b69d9-4k9wr\" (UID: \"d7389ac7-6063-4742-8f3a-bcf2b0721522\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4k9wr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326512 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fa727e5-e5ff-49bc-925c-eb272a95a228-config\") pod \"service-ca-operator-777779d784-khp2g\" (UID: \"5fa727e5-e5ff-49bc-925c-eb272a95a228\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-khp2g" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326537 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326559 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/44c6fbd1-b182-4c7a-867a-6f310ed97fdb-socket-dir\") pod \"csi-hostpathplugin-9nwpf\" (UID: \"44c6fbd1-b182-4c7a-867a-6f310ed97fdb\") " pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326581 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xwxc\" (UniqueName: \"kubernetes.io/projected/00c254c1-9578-492b-8e89-2272cd24b565-kube-api-access-6xwxc\") pod \"machine-config-server-jvjwz\" (UID: \"00c254c1-9578-492b-8e89-2272cd24b565\") " pod="openshift-machine-config-operator/machine-config-server-jvjwz" Sep 29 09:50:13 crc 
kubenswrapper[4891]: I0929 09:50:13.326604 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/17df4c88-f73c-4521-87b8-211b77833d96-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9t6nh\" (UID: \"17df4c88-f73c-4521-87b8-211b77833d96\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9t6nh" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326621 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfskw\" (UniqueName: \"kubernetes.io/projected/863b27cf-013e-4f40-8b0f-3323ee3aa8ad-kube-api-access-qfskw\") pod \"package-server-manager-789f6589d5-8pm7n\" (UID: \"863b27cf-013e-4f40-8b0f-3323ee3aa8ad\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm7n" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326650 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e702ccf-0d4b-4ad0-bb27-16bc39303992-config\") pod \"kube-controller-manager-operator-78b949d7b-g5tqf\" (UID: \"0e702ccf-0d4b-4ad0-bb27-16bc39303992\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g5tqf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326669 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1afb1091-dfff-4abe-a642-83e586b66b4e-config\") pod \"console-operator-58897d9998-jnfml\" (UID: \"1afb1091-dfff-4abe-a642-83e586b66b4e\") " pod="openshift-console-operator/console-operator-58897d9998-jnfml" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326691 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/beebb3f2-0934-4828-afec-cacda471836b-srv-cert\") 
pod \"olm-operator-6b444d44fb-pfbzg\" (UID: \"beebb3f2-0934-4828-afec-cacda471836b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfbzg" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326706 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/beebb3f2-0934-4828-afec-cacda471836b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pfbzg\" (UID: \"beebb3f2-0934-4828-afec-cacda471836b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfbzg" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326729 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e702ccf-0d4b-4ad0-bb27-16bc39303992-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-g5tqf\" (UID: \"0e702ccf-0d4b-4ad0-bb27-16bc39303992\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g5tqf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326748 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9231f127-f250-40fa-b818-7b01584701d2-proxy-tls\") pod \"machine-config-operator-74547568cd-jf4xk\" (UID: \"9231f127-f250-40fa-b818-7b01584701d2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jf4xk" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326770 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxwkb\" (UniqueName: \"kubernetes.io/projected/17df4c88-f73c-4521-87b8-211b77833d96-kube-api-access-bxwkb\") pod \"multus-admission-controller-857f4d67dd-9t6nh\" (UID: \"17df4c88-f73c-4521-87b8-211b77833d96\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9t6nh" Sep 29 09:50:13 crc 
kubenswrapper[4891]: I0929 09:50:13.326807 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fa727e5-e5ff-49bc-925c-eb272a95a228-serving-cert\") pod \"service-ca-operator-777779d784-khp2g\" (UID: \"5fa727e5-e5ff-49bc-925c-eb272a95a228\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-khp2g" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.326827 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.329023 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.329628 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9231f127-f250-40fa-b818-7b01584701d2-images\") pod \"machine-config-operator-74547568cd-jf4xk\" (UID: \"9231f127-f250-40fa-b818-7b01584701d2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jf4xk" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.329672 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-ca-trust-extracted\") pod 
\"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.329906 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d0c31e3-682d-4758-927e-1e293d39630c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9c55k\" (UID: \"2d0c31e3-682d-4758-927e-1e293d39630c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9c55k" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.330036 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/017fcc94-98f6-4abd-9954-c3212676f6e7-audit-dir\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.330773 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c622ed91-dbc3-4b63-bb42-a9fed92bf9ef-serving-cert\") pod \"etcd-operator-b45778765-lkthp\" (UID: \"c622ed91-dbc3-4b63-bb42-a9fed92bf9ef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lkthp" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.330062 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2ca8f6f-22af-4326-88a8-e29b3dc62384-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-s5t75\" (UID: \"c2ca8f6f-22af-4326-88a8-e29b3dc62384\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5t75" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.331404 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/d7389ac7-6063-4742-8f3a-bcf2b0721522-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4k9wr\" (UID: \"d7389ac7-6063-4742-8f3a-bcf2b0721522\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4k9wr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.331438 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/44c6fbd1-b182-4c7a-867a-6f310ed97fdb-registration-dir\") pod \"csi-hostpathplugin-9nwpf\" (UID: \"44c6fbd1-b182-4c7a-867a-6f310ed97fdb\") " pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.331465 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/44c6fbd1-b182-4c7a-867a-6f310ed97fdb-mountpoint-dir\") pod \"csi-hostpathplugin-9nwpf\" (UID: \"44c6fbd1-b182-4c7a-867a-6f310ed97fdb\") " pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.331490 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.331514 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/71c66a9c-1955-4f9e-8b23-7bd7361b8550-metrics-tls\") pod \"dns-default-mzzff\" (UID: \"71c66a9c-1955-4f9e-8b23-7bd7361b8550\") " pod="openshift-dns/dns-default-mzzff" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.331539 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggddg\" (UniqueName: \"kubernetes.io/projected/a8d02cdd-0b15-456d-ad23-5ff1b81875a6-kube-api-access-ggddg\") pod \"ingress-canary-6lfmx\" (UID: \"a8d02cdd-0b15-456d-ad23-5ff1b81875a6\") " pod="openshift-ingress-canary/ingress-canary-6lfmx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.331565 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-registry-certificates\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.332907 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4vbn\" (UniqueName: \"kubernetes.io/projected/1afb1091-dfff-4abe-a642-83e586b66b4e-kube-api-access-h4vbn\") pod \"console-operator-58897d9998-jnfml\" (UID: \"1afb1091-dfff-4abe-a642-83e586b66b4e\") " pod="openshift-console-operator/console-operator-58897d9998-jnfml" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.332948 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.332981 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c7pz\" (UniqueName: \"kubernetes.io/projected/d8e4b574-6c94-42fe-9e04-651a2f252e8e-kube-api-access-5c7pz\") pod \"catalog-operator-68c6474976-9nk5w\" (UID: 
\"d8e4b574-6c94-42fe-9e04-651a2f252e8e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9nk5w" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.333233 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-registry-certificates\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.333577 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d7389ac7-6063-4742-8f3a-bcf2b0721522-metrics-tls\") pod \"ingress-operator-5b745b69d9-4k9wr\" (UID: \"d7389ac7-6063-4742-8f3a-bcf2b0721522\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4k9wr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.334093 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/36a6681d-099f-47d1-91a8-da5c2c510a21-signing-cabundle\") pod \"service-ca-9c57cc56f-llvr5\" (UID: \"36a6681d-099f-47d1-91a8-da5c2c510a21\") " pod="openshift-service-ca/service-ca-9c57cc56f-llvr5" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.334847 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.335051 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/36a6681d-099f-47d1-91a8-da5c2c510a21-signing-cabundle\") pod \"service-ca-9c57cc56f-llvr5\" (UID: \"36a6681d-099f-47d1-91a8-da5c2c510a21\") " pod="openshift-service-ca/service-ca-9c57cc56f-llvr5" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.335355 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7389ac7-6063-4742-8f3a-bcf2b0721522-trusted-ca\") pod \"ingress-operator-5b745b69d9-4k9wr\" (UID: \"d7389ac7-6063-4742-8f3a-bcf2b0721522\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4k9wr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.335453 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c622ed91-dbc3-4b63-bb42-a9fed92bf9ef-etcd-service-ca\") pod \"etcd-operator-b45778765-lkthp\" (UID: \"c622ed91-dbc3-4b63-bb42-a9fed92bf9ef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lkthp" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.335660 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.336067 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1afb1091-dfff-4abe-a642-83e586b66b4e-serving-cert\") pod \"console-operator-58897d9998-jnfml\" (UID: \"1afb1091-dfff-4abe-a642-83e586b66b4e\") " pod="openshift-console-operator/console-operator-58897d9998-jnfml" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.336343 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.337218 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.337590 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d102c7ee-0242-4b77-85f4-5ca86e742bf2-client-ca\") pod \"route-controller-manager-6576b87f9c-6bzpg\" (UID: \"d102c7ee-0242-4b77-85f4-5ca86e742bf2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.338344 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c622ed91-dbc3-4b63-bb42-a9fed92bf9ef-config\") pod \"etcd-operator-b45778765-lkthp\" (UID: \"c622ed91-dbc3-4b63-bb42-a9fed92bf9ef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lkthp" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.340156 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1afb1091-dfff-4abe-a642-83e586b66b4e-trusted-ca\") pod \"console-operator-58897d9998-jnfml\" (UID: \"1afb1091-dfff-4abe-a642-83e586b66b4e\") " pod="openshift-console-operator/console-operator-58897d9998-jnfml" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 
09:50:13.341927 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c622ed91-dbc3-4b63-bb42-a9fed92bf9ef-etcd-ca\") pod \"etcd-operator-b45778765-lkthp\" (UID: \"c622ed91-dbc3-4b63-bb42-a9fed92bf9ef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lkthp" Sep 29 09:50:13 crc kubenswrapper[4891]: E0929 09:50:13.342204 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:13.842157901 +0000 UTC m=+144.047326222 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.342348 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2017d3b8-cade-41e1-91ce-962c506c4c31-tmpfs\") pod \"packageserver-d55dfcdfc-4nmlj\" (UID: \"2017d3b8-cade-41e1-91ce-962c506c4c31\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4nmlj" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.342535 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9231f127-f250-40fa-b818-7b01584701d2-images\") pod \"machine-config-operator-74547568cd-jf4xk\" (UID: \"9231f127-f250-40fa-b818-7b01584701d2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jf4xk" Sep 29 
09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.343145 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.343252 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1afb1091-dfff-4abe-a642-83e586b66b4e-config\") pod \"console-operator-58897d9998-jnfml\" (UID: \"1afb1091-dfff-4abe-a642-83e586b66b4e\") " pod="openshift-console-operator/console-operator-58897d9998-jnfml" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.343524 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/017fcc94-98f6-4abd-9954-c3212676f6e7-audit-dir\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.343570 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-trusted-ca\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.344109 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9231f127-f250-40fa-b818-7b01584701d2-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jf4xk\" (UID: \"9231f127-f250-40fa-b818-7b01584701d2\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jf4xk" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.344459 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2dadc724-8c40-4857-ab4a-bd4dbb58963b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-vkk7z\" (UID: \"2dadc724-8c40-4857-ab4a-bd4dbb58963b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkk7z" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.344543 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a36bae20-4dc0-4377-ad84-9cb27c982c88-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-n6x5w\" (UID: \"a36bae20-4dc0-4377-ad84-9cb27c982c88\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n6x5w" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.344829 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d0c31e3-682d-4758-927e-1e293d39630c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9c55k\" (UID: \"2d0c31e3-682d-4758-927e-1e293d39630c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9c55k" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.344893 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d102c7ee-0242-4b77-85f4-5ca86e742bf2-serving-cert\") pod \"route-controller-manager-6576b87f9c-6bzpg\" (UID: \"d102c7ee-0242-4b77-85f4-5ca86e742bf2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.345599 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c2ca8f6f-22af-4326-88a8-e29b3dc62384-config\") pod \"kube-apiserver-operator-766d6c64bb-s5t75\" (UID: \"c2ca8f6f-22af-4326-88a8-e29b3dc62384\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5t75" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.345631 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d102c7ee-0242-4b77-85f4-5ca86e742bf2-config\") pod \"route-controller-manager-6576b87f9c-6bzpg\" (UID: \"d102c7ee-0242-4b77-85f4-5ca86e742bf2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.345862 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/36a6681d-099f-47d1-91a8-da5c2c510a21-signing-key\") pod \"service-ca-9c57cc56f-llvr5\" (UID: \"36a6681d-099f-47d1-91a8-da5c2c510a21\") " pod="openshift-service-ca/service-ca-9c57cc56f-llvr5" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.347548 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e702ccf-0d4b-4ad0-bb27-16bc39303992-config\") pod \"kube-controller-manager-operator-78b949d7b-g5tqf\" (UID: \"0e702ccf-0d4b-4ad0-bb27-16bc39303992\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g5tqf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.348572 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.348562 4891 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.348804 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2017d3b8-cade-41e1-91ce-962c506c4c31-webhook-cert\") pod \"packageserver-d55dfcdfc-4nmlj\" (UID: \"2017d3b8-cade-41e1-91ce-962c506c4c31\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4nmlj" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.349162 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-registry-tls\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.349202 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2dadc724-8c40-4857-ab4a-bd4dbb58963b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-vkk7z\" (UID: \"2dadc724-8c40-4857-ab4a-bd4dbb58963b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkk7z" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.349318 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d0c31e3-682d-4758-927e-1e293d39630c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9c55k\" (UID: \"2d0c31e3-682d-4758-927e-1e293d39630c\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9c55k" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.349335 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e702ccf-0d4b-4ad0-bb27-16bc39303992-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-g5tqf\" (UID: \"0e702ccf-0d4b-4ad0-bb27-16bc39303992\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g5tqf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.349759 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/21a1ea5d-e0be-47d8-8ed4-f2e9b040771d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-g6rvx\" (UID: \"21a1ea5d-e0be-47d8-8ed4-f2e9b040771d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g6rvx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.352071 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/017fcc94-98f6-4abd-9954-c3212676f6e7-audit-policies\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.352710 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.348183 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/b5dbaded-b4ab-4c6d-b825-885f87113f5b-metrics-tls\") pod \"dns-operator-744455d44c-h4vv2\" (UID: \"b5dbaded-b4ab-4c6d-b825-885f87113f5b\") " pod="openshift-dns-operator/dns-operator-744455d44c-h4vv2" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.353035 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.353108 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2017d3b8-cade-41e1-91ce-962c506c4c31-apiservice-cert\") pod \"packageserver-d55dfcdfc-4nmlj\" (UID: \"2017d3b8-cade-41e1-91ce-962c506c4c31\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4nmlj" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.353476 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.353490 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9231f127-f250-40fa-b818-7b01584701d2-proxy-tls\") pod \"machine-config-operator-74547568cd-jf4xk\" (UID: \"9231f127-f250-40fa-b818-7b01584701d2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jf4xk" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 
09:50:13.353511 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c622ed91-dbc3-4b63-bb42-a9fed92bf9ef-etcd-client\") pod \"etcd-operator-b45778765-lkthp\" (UID: \"c622ed91-dbc3-4b63-bb42-a9fed92bf9ef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lkthp" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.353846 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2ca8f6f-22af-4326-88a8-e29b3dc62384-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-s5t75\" (UID: \"c2ca8f6f-22af-4326-88a8-e29b3dc62384\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5t75" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.356851 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a36bae20-4dc0-4377-ad84-9cb27c982c88-proxy-tls\") pod \"machine-config-controller-84d6567774-n6x5w\" (UID: \"a36bae20-4dc0-4377-ad84-9cb27c982c88\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n6x5w" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.358312 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.378489 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz6mt\" (UniqueName: \"kubernetes.io/projected/a36bae20-4dc0-4377-ad84-9cb27c982c88-kube-api-access-fz6mt\") pod \"machine-config-controller-84d6567774-n6x5w\" (UID: \"a36bae20-4dc0-4377-ad84-9cb27c982c88\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n6x5w" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.378765 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.381223 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-bound-sa-token\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.418034 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4vbn\" (UniqueName: \"kubernetes.io/projected/1afb1091-dfff-4abe-a642-83e586b66b4e-kube-api-access-h4vbn\") pod \"console-operator-58897d9998-jnfml\" (UID: \"1afb1091-dfff-4abe-a642-83e586b66b4e\") " pod="openshift-console-operator/console-operator-58897d9998-jnfml" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.435659 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/00c254c1-9578-492b-8e89-2272cd24b565-node-bootstrap-token\") pod \"machine-config-server-jvjwz\" (UID: \"00c254c1-9578-492b-8e89-2272cd24b565\") " pod="openshift-machine-config-operator/machine-config-server-jvjwz" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.435719 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/443c2a5c-8366-4170-80e7-063687c1caaf-secret-volume\") pod \"collect-profiles-29318985-cvhgh\" (UID: \"443c2a5c-8366-4170-80e7-063687c1caaf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-cvhgh" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.435748 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/44c6fbd1-b182-4c7a-867a-6f310ed97fdb-plugins-dir\") pod \"csi-hostpathplugin-9nwpf\" (UID: \"44c6fbd1-b182-4c7a-867a-6f310ed97fdb\") " pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.435824 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fa727e5-e5ff-49bc-925c-eb272a95a228-config\") pod \"service-ca-operator-777779d784-khp2g\" (UID: \"5fa727e5-e5ff-49bc-925c-eb272a95a228\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-khp2g" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.435873 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/44c6fbd1-b182-4c7a-867a-6f310ed97fdb-socket-dir\") pod \"csi-hostpathplugin-9nwpf\" (UID: \"44c6fbd1-b182-4c7a-867a-6f310ed97fdb\") " pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.435900 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xwxc\" (UniqueName: \"kubernetes.io/projected/00c254c1-9578-492b-8e89-2272cd24b565-kube-api-access-6xwxc\") pod \"machine-config-server-jvjwz\" (UID: \"00c254c1-9578-492b-8e89-2272cd24b565\") " pod="openshift-machine-config-operator/machine-config-server-jvjwz" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.435926 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/17df4c88-f73c-4521-87b8-211b77833d96-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9t6nh\" (UID: \"17df4c88-f73c-4521-87b8-211b77833d96\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9t6nh" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.435949 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfskw\" (UniqueName: \"kubernetes.io/projected/863b27cf-013e-4f40-8b0f-3323ee3aa8ad-kube-api-access-qfskw\") pod \"package-server-manager-789f6589d5-8pm7n\" (UID: \"863b27cf-013e-4f40-8b0f-3323ee3aa8ad\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm7n" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.435980 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/beebb3f2-0934-4828-afec-cacda471836b-srv-cert\") pod \"olm-operator-6b444d44fb-pfbzg\" (UID: \"beebb3f2-0934-4828-afec-cacda471836b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfbzg" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.436004 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/beebb3f2-0934-4828-afec-cacda471836b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pfbzg\" (UID: \"beebb3f2-0934-4828-afec-cacda471836b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfbzg" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.436050 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fa727e5-e5ff-49bc-925c-eb272a95a228-serving-cert\") pod \"service-ca-operator-777779d784-khp2g\" (UID: \"5fa727e5-e5ff-49bc-925c-eb272a95a228\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-khp2g" Sep 29 09:50:13 
crc kubenswrapper[4891]: I0929 09:50:13.436075 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxwkb\" (UniqueName: \"kubernetes.io/projected/17df4c88-f73c-4521-87b8-211b77833d96-kube-api-access-bxwkb\") pod \"multus-admission-controller-857f4d67dd-9t6nh\" (UID: \"17df4c88-f73c-4521-87b8-211b77833d96\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9t6nh" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.436109 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/44c6fbd1-b182-4c7a-867a-6f310ed97fdb-registration-dir\") pod \"csi-hostpathplugin-9nwpf\" (UID: \"44c6fbd1-b182-4c7a-867a-6f310ed97fdb\") " pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.436134 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/44c6fbd1-b182-4c7a-867a-6f310ed97fdb-mountpoint-dir\") pod \"csi-hostpathplugin-9nwpf\" (UID: \"44c6fbd1-b182-4c7a-867a-6f310ed97fdb\") " pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.436160 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/71c66a9c-1955-4f9e-8b23-7bd7361b8550-metrics-tls\") pod \"dns-default-mzzff\" (UID: \"71c66a9c-1955-4f9e-8b23-7bd7361b8550\") " pod="openshift-dns/dns-default-mzzff" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.436174 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/44c6fbd1-b182-4c7a-867a-6f310ed97fdb-plugins-dir\") pod \"csi-hostpathplugin-9nwpf\" (UID: \"44c6fbd1-b182-4c7a-867a-6f310ed97fdb\") " pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" Sep 29 09:50:13 crc kubenswrapper[4891]: 
I0929 09:50:13.436185 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggddg\" (UniqueName: \"kubernetes.io/projected/a8d02cdd-0b15-456d-ad23-5ff1b81875a6-kube-api-access-ggddg\") pod \"ingress-canary-6lfmx\" (UID: \"a8d02cdd-0b15-456d-ad23-5ff1b81875a6\") " pod="openshift-ingress-canary/ingress-canary-6lfmx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.436279 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c7pz\" (UniqueName: \"kubernetes.io/projected/d8e4b574-6c94-42fe-9e04-651a2f252e8e-kube-api-access-5c7pz\") pod \"catalog-operator-68c6474976-9nk5w\" (UID: \"d8e4b574-6c94-42fe-9e04-651a2f252e8e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9nk5w" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.436383 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/492762a9-9e7d-4095-b2f4-990f58b82d21-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-96vcr\" (UID: \"492762a9-9e7d-4095-b2f4-990f58b82d21\") " pod="openshift-marketplace/marketplace-operator-79b997595-96vcr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.436451 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjrm4\" (UniqueName: \"kubernetes.io/projected/44c6fbd1-b182-4c7a-867a-6f310ed97fdb-kube-api-access-cjrm4\") pod \"csi-hostpathplugin-9nwpf\" (UID: \"44c6fbd1-b182-4c7a-867a-6f310ed97fdb\") " pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.436485 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/44c6fbd1-b182-4c7a-867a-6f310ed97fdb-csi-data-dir\") pod \"csi-hostpathplugin-9nwpf\" (UID: \"44c6fbd1-b182-4c7a-867a-6f310ed97fdb\") " 
pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.436513 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lblz4\" (UniqueName: \"kubernetes.io/projected/5fa727e5-e5ff-49bc-925c-eb272a95a228-kube-api-access-lblz4\") pod \"service-ca-operator-777779d784-khp2g\" (UID: \"5fa727e5-e5ff-49bc-925c-eb272a95a228\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-khp2g" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.436542 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/492762a9-9e7d-4095-b2f4-990f58b82d21-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-96vcr\" (UID: \"492762a9-9e7d-4095-b2f4-990f58b82d21\") " pod="openshift-marketplace/marketplace-operator-79b997595-96vcr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.436583 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d8e4b574-6c94-42fe-9e04-651a2f252e8e-srv-cert\") pod \"catalog-operator-68c6474976-9nk5w\" (UID: \"d8e4b574-6c94-42fe-9e04-651a2f252e8e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9nk5w" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.437315 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fa727e5-e5ff-49bc-925c-eb272a95a228-config\") pod \"service-ca-operator-777779d784-khp2g\" (UID: \"5fa727e5-e5ff-49bc-925c-eb272a95a228\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-khp2g" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.437767 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/44c6fbd1-b182-4c7a-867a-6f310ed97fdb-socket-dir\") pod 
\"csi-hostpathplugin-9nwpf\" (UID: \"44c6fbd1-b182-4c7a-867a-6f310ed97fdb\") " pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.437843 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/44c6fbd1-b182-4c7a-867a-6f310ed97fdb-registration-dir\") pod \"csi-hostpathplugin-9nwpf\" (UID: \"44c6fbd1-b182-4c7a-867a-6f310ed97fdb\") " pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.438090 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/44c6fbd1-b182-4c7a-867a-6f310ed97fdb-mountpoint-dir\") pod \"csi-hostpathplugin-9nwpf\" (UID: \"44c6fbd1-b182-4c7a-867a-6f310ed97fdb\") " pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.438324 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/44c6fbd1-b182-4c7a-867a-6f310ed97fdb-csi-data-dir\") pod \"csi-hostpathplugin-9nwpf\" (UID: \"44c6fbd1-b182-4c7a-867a-6f310ed97fdb\") " pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.440800 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l25s\" (UniqueName: \"kubernetes.io/projected/2017d3b8-cade-41e1-91ce-962c506c4c31-kube-api-access-7l25s\") pod \"packageserver-d55dfcdfc-4nmlj\" (UID: \"2017d3b8-cade-41e1-91ce-962c506c4c31\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4nmlj" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.441372 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/492762a9-9e7d-4095-b2f4-990f58b82d21-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-96vcr\" (UID: \"492762a9-9e7d-4095-b2f4-990f58b82d21\") " pod="openshift-marketplace/marketplace-operator-79b997595-96vcr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.442298 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d8e4b574-6c94-42fe-9e04-651a2f252e8e-profile-collector-cert\") pod \"catalog-operator-68c6474976-9nk5w\" (UID: \"d8e4b574-6c94-42fe-9e04-651a2f252e8e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9nk5w" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.443039 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/00c254c1-9578-492b-8e89-2272cd24b565-certs\") pod \"machine-config-server-jvjwz\" (UID: \"00c254c1-9578-492b-8e89-2272cd24b565\") " pod="openshift-machine-config-operator/machine-config-server-jvjwz" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.443100 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/863b27cf-013e-4f40-8b0f-3323ee3aa8ad-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8pm7n\" (UID: \"863b27cf-013e-4f40-8b0f-3323ee3aa8ad\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm7n" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.443191 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s8ms\" (UniqueName: \"kubernetes.io/projected/492762a9-9e7d-4095-b2f4-990f58b82d21-kube-api-access-5s8ms\") pod \"marketplace-operator-79b997595-96vcr\" (UID: \"492762a9-9e7d-4095-b2f4-990f58b82d21\") " pod="openshift-marketplace/marketplace-operator-79b997595-96vcr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.443220 4891 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.443245 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71c66a9c-1955-4f9e-8b23-7bd7361b8550-config-volume\") pod \"dns-default-mzzff\" (UID: \"71c66a9c-1955-4f9e-8b23-7bd7361b8550\") " pod="openshift-dns/dns-default-mzzff" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.443306 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8d02cdd-0b15-456d-ad23-5ff1b81875a6-cert\") pod \"ingress-canary-6lfmx\" (UID: \"a8d02cdd-0b15-456d-ad23-5ff1b81875a6\") " pod="openshift-ingress-canary/ingress-canary-6lfmx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.443351 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/443c2a5c-8366-4170-80e7-063687c1caaf-config-volume\") pod \"collect-profiles-29318985-cvhgh\" (UID: \"443c2a5c-8366-4170-80e7-063687c1caaf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-cvhgh" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.443379 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc9xc\" (UniqueName: \"kubernetes.io/projected/71c66a9c-1955-4f9e-8b23-7bd7361b8550-kube-api-access-fc9xc\") pod \"dns-default-mzzff\" (UID: \"71c66a9c-1955-4f9e-8b23-7bd7361b8550\") " pod="openshift-dns/dns-default-mzzff" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.443401 4891 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gbdpj\" (UniqueName: \"kubernetes.io/projected/443c2a5c-8366-4170-80e7-063687c1caaf-kube-api-access-gbdpj\") pod \"collect-profiles-29318985-cvhgh\" (UID: \"443c2a5c-8366-4170-80e7-063687c1caaf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-cvhgh" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.443431 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzz4k\" (UniqueName: \"kubernetes.io/projected/beebb3f2-0934-4828-afec-cacda471836b-kube-api-access-tzz4k\") pod \"olm-operator-6b444d44fb-pfbzg\" (UID: \"beebb3f2-0934-4828-afec-cacda471836b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfbzg" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.443551 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/71c66a9c-1955-4f9e-8b23-7bd7361b8550-metrics-tls\") pod \"dns-default-mzzff\" (UID: \"71c66a9c-1955-4f9e-8b23-7bd7361b8550\") " pod="openshift-dns/dns-default-mzzff" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.443764 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/beebb3f2-0934-4828-afec-cacda471836b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pfbzg\" (UID: \"beebb3f2-0934-4828-afec-cacda471836b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfbzg" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.444574 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/17df4c88-f73c-4521-87b8-211b77833d96-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9t6nh\" (UID: \"17df4c88-f73c-4521-87b8-211b77833d96\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9t6nh" Sep 29 09:50:13 crc 
kubenswrapper[4891]: I0929 09:50:13.446308 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/beebb3f2-0934-4828-afec-cacda471836b-srv-cert\") pod \"olm-operator-6b444d44fb-pfbzg\" (UID: \"beebb3f2-0934-4828-afec-cacda471836b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfbzg" Sep 29 09:50:13 crc kubenswrapper[4891]: E0929 09:50:13.446512 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:13.946494759 +0000 UTC m=+144.151663080 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.452069 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8d02cdd-0b15-456d-ad23-5ff1b81875a6-cert\") pod \"ingress-canary-6lfmx\" (UID: \"a8d02cdd-0b15-456d-ad23-5ff1b81875a6\") " pod="openshift-ingress-canary/ingress-canary-6lfmx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.454335 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/443c2a5c-8366-4170-80e7-063687c1caaf-secret-volume\") pod \"collect-profiles-29318985-cvhgh\" (UID: \"443c2a5c-8366-4170-80e7-063687c1caaf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-cvhgh" Sep 29 09:50:13 crc kubenswrapper[4891]: 
I0929 09:50:13.454773 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/492762a9-9e7d-4095-b2f4-990f58b82d21-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-96vcr\" (UID: \"492762a9-9e7d-4095-b2f4-990f58b82d21\") " pod="openshift-marketplace/marketplace-operator-79b997595-96vcr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.454860 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fa727e5-e5ff-49bc-925c-eb272a95a228-serving-cert\") pod \"service-ca-operator-777779d784-khp2g\" (UID: \"5fa727e5-e5ff-49bc-925c-eb272a95a228\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-khp2g" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.454921 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/00c254c1-9578-492b-8e89-2272cd24b565-certs\") pod \"machine-config-server-jvjwz\" (UID: \"00c254c1-9578-492b-8e89-2272cd24b565\") " pod="openshift-machine-config-operator/machine-config-server-jvjwz" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.454959 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/00c254c1-9578-492b-8e89-2272cd24b565-node-bootstrap-token\") pod \"machine-config-server-jvjwz\" (UID: \"00c254c1-9578-492b-8e89-2272cd24b565\") " pod="openshift-machine-config-operator/machine-config-server-jvjwz" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.455051 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/443c2a5c-8366-4170-80e7-063687c1caaf-config-volume\") pod \"collect-profiles-29318985-cvhgh\" (UID: \"443c2a5c-8366-4170-80e7-063687c1caaf\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-cvhgh" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.458155 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/863b27cf-013e-4f40-8b0f-3323ee3aa8ad-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8pm7n\" (UID: \"863b27cf-013e-4f40-8b0f-3323ee3aa8ad\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm7n" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.458426 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d8e4b574-6c94-42fe-9e04-651a2f252e8e-srv-cert\") pod \"catalog-operator-68c6474976-9nk5w\" (UID: \"d8e4b574-6c94-42fe-9e04-651a2f252e8e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9nk5w" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.464059 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71c66a9c-1955-4f9e-8b23-7bd7361b8550-config-volume\") pod \"dns-default-mzzff\" (UID: \"71c66a9c-1955-4f9e-8b23-7bd7361b8550\") " pod="openshift-dns/dns-default-mzzff" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.468481 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d8e4b574-6c94-42fe-9e04-651a2f252e8e-profile-collector-cert\") pod \"catalog-operator-68c6474976-9nk5w\" (UID: \"d8e4b574-6c94-42fe-9e04-651a2f252e8e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9nk5w" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.473029 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsnrd\" (UniqueName: 
\"kubernetes.io/projected/2dadc724-8c40-4857-ab4a-bd4dbb58963b-kube-api-access-zsnrd\") pod \"cluster-image-registry-operator-dc59b4c8b-vkk7z\" (UID: \"2dadc724-8c40-4857-ab4a-bd4dbb58963b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkk7z" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.476128 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-cc2bw"] Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.485088 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2dadc724-8c40-4857-ab4a-bd4dbb58963b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-vkk7z\" (UID: \"2dadc724-8c40-4857-ab4a-bd4dbb58963b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkk7z" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.499240 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bspfg"] Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.504373 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c2ca8f6f-22af-4326-88a8-e29b3dc62384-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-s5t75\" (UID: \"c2ca8f6f-22af-4326-88a8-e29b3dc62384\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5t75" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.518630 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp8pj\" (UniqueName: \"kubernetes.io/projected/b5dbaded-b4ab-4c6d-b825-885f87113f5b-kube-api-access-bp8pj\") pod \"dns-operator-744455d44c-h4vv2\" (UID: \"b5dbaded-b4ab-4c6d-b825-885f87113f5b\") " pod="openshift-dns-operator/dns-operator-744455d44c-h4vv2" Sep 29 09:50:13 crc kubenswrapper[4891]: W0929 
09:50:13.534584 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0971f343_a162_4db1_96bb_3857bd667ad2.slice/crio-c2fe74d678defc9a3d3209c1c46fbc18d690cae4cc72821976c6785458104487 WatchSource:0}: Error finding container c2fe74d678defc9a3d3209c1c46fbc18d690cae4cc72821976c6785458104487: Status 404 returned error can't find the container with id c2fe74d678defc9a3d3209c1c46fbc18d690cae4cc72821976c6785458104487 Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.535175 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94twf\" (UniqueName: \"kubernetes.io/projected/017fcc94-98f6-4abd-9954-c3212676f6e7-kube-api-access-94twf\") pod \"oauth-openshift-558db77b4-cd8dx\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.541251 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkk7z" Sep 29 09:50:13 crc kubenswrapper[4891]: E0929 09:50:13.545964 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:14.045913314 +0000 UTC m=+144.251081655 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.548617 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.549625 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: E0929 09:50:13.550357 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:14.050339343 +0000 UTC m=+144.255507664 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.554461 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxnw4\" (UniqueName: \"kubernetes.io/projected/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-kube-api-access-wxnw4\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.566393 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5t75" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.570360 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tf4wv"] Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.580109 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sv4w\" (UniqueName: \"kubernetes.io/projected/c622ed91-dbc3-4b63-bb42-a9fed92bf9ef-kube-api-access-8sv4w\") pod \"etcd-operator-b45778765-lkthp\" (UID: \"c622ed91-dbc3-4b63-bb42-a9fed92bf9ef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lkthp" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.584899 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-jnfml" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.594155 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d0c31e3-682d-4758-927e-1e293d39630c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9c55k\" (UID: \"2d0c31e3-682d-4758-927e-1e293d39630c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9c55k" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.610616 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9c55k" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.614607 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwt75\" (UniqueName: \"kubernetes.io/projected/36a6681d-099f-47d1-91a8-da5c2c510a21-kube-api-access-mwt75\") pod \"service-ca-9c57cc56f-llvr5\" (UID: \"36a6681d-099f-47d1-91a8-da5c2c510a21\") " pod="openshift-service-ca/service-ca-9c57cc56f-llvr5" Sep 29 09:50:13 crc kubenswrapper[4891]: W0929 09:50:13.614648 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96fa581a_e2fa_4c16_a646_4ac94eec1ef0.slice/crio-36cd491be2de74f14df372292ff72f35cf0ee9f191674650e9fdee5521e2c14a WatchSource:0}: Error finding container 36cd491be2de74f14df372292ff72f35cf0ee9f191674650e9fdee5521e2c14a: Status 404 returned error can't find the container with id 36cd491be2de74f14df372292ff72f35cf0ee9f191674650e9fdee5521e2c14a Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.626131 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-h4vv2" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.631629 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdsdl\" (UniqueName: \"kubernetes.io/projected/9231f127-f250-40fa-b818-7b01584701d2-kube-api-access-rdsdl\") pod \"machine-config-operator-74547568cd-jf4xk\" (UID: \"9231f127-f250-40fa-b818-7b01584701d2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jf4xk" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.634821 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.643346 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n6x5w" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.652756 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.653076 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jf4xk" Sep 29 09:50:13 crc kubenswrapper[4891]: E0929 09:50:13.653254 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:14.153230379 +0000 UTC m=+144.358398700 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.653839 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: E0929 09:50:13.655210 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:14.155193966 +0000 UTC m=+144.360362287 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.662177 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-llvr5" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.663889 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrnwh\" (UniqueName: \"kubernetes.io/projected/d102c7ee-0242-4b77-85f4-5ca86e742bf2-kube-api-access-mrnwh\") pod \"route-controller-manager-6576b87f9c-6bzpg\" (UID: \"d102c7ee-0242-4b77-85f4-5ca86e742bf2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.667304 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4nmlj" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.679905 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw4f4\" (UniqueName: \"kubernetes.io/projected/d7389ac7-6063-4742-8f3a-bcf2b0721522-kube-api-access-nw4f4\") pod \"ingress-operator-5b745b69d9-4k9wr\" (UID: \"d7389ac7-6063-4742-8f3a-bcf2b0721522\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4k9wr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.696990 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e702ccf-0d4b-4ad0-bb27-16bc39303992-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-g5tqf\" (UID: \"0e702ccf-0d4b-4ad0-bb27-16bc39303992\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g5tqf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.705843 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dm6pg"] Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.713279 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-7tk8t"] Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.728281 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4nxs\" (UniqueName: \"kubernetes.io/projected/21a1ea5d-e0be-47d8-8ed4-f2e9b040771d-kube-api-access-k4nxs\") pod \"cluster-samples-operator-665b6dd947-g6rvx\" (UID: \"21a1ea5d-e0be-47d8-8ed4-f2e9b040771d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g6rvx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.745308 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7389ac7-6063-4742-8f3a-bcf2b0721522-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4k9wr\" (UID: \"d7389ac7-6063-4742-8f3a-bcf2b0721522\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4k9wr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.756556 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:13 crc kubenswrapper[4891]: E0929 09:50:13.758121 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:14.258073462 +0000 UTC m=+144.463241773 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.790448 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srp57\" (UniqueName: \"kubernetes.io/projected/588d9386-5914-4ca4-bea2-fcb06e0de4b3-kube-api-access-srp57\") pod \"migrator-59844c95c7-54mnr\" (UID: \"588d9386-5914-4ca4-bea2-fcb06e0de4b3\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-54mnr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.803570 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkk7z"] Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.803661 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggddg\" (UniqueName: \"kubernetes.io/projected/a8d02cdd-0b15-456d-ad23-5ff1b81875a6-kube-api-access-ggddg\") pod \"ingress-canary-6lfmx\" (UID: \"a8d02cdd-0b15-456d-ad23-5ff1b81875a6\") " pod="openshift-ingress-canary/ingress-canary-6lfmx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.828771 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xwxc\" (UniqueName: \"kubernetes.io/projected/00c254c1-9578-492b-8e89-2272cd24b565-kube-api-access-6xwxc\") pod \"machine-config-server-jvjwz\" (UID: \"00c254c1-9578-492b-8e89-2272cd24b565\") " pod="openshift-machine-config-operator/machine-config-server-jvjwz" Sep 29 09:50:13 crc kubenswrapper[4891]: W0929 09:50:13.834532 4891 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dadc724_8c40_4857_ab4a_bd4dbb58963b.slice/crio-e7527c5871444a0a65fd61573e39e1d346f899532745ed3c639d344e9e9a1059 WatchSource:0}: Error finding container e7527c5871444a0a65fd61573e39e1d346f899532745ed3c639d344e9e9a1059: Status 404 returned error can't find the container with id e7527c5871444a0a65fd61573e39e1d346f899532745ed3c639d344e9e9a1059 Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.846282 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjrm4\" (UniqueName: \"kubernetes.io/projected/44c6fbd1-b182-4c7a-867a-6f310ed97fdb-kube-api-access-cjrm4\") pod \"csi-hostpathplugin-9nwpf\" (UID: \"44c6fbd1-b182-4c7a-867a-6f310ed97fdb\") " pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.854988 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxwkb\" (UniqueName: \"kubernetes.io/projected/17df4c88-f73c-4521-87b8-211b77833d96-kube-api-access-bxwkb\") pod \"multus-admission-controller-857f4d67dd-9t6nh\" (UID: \"17df4c88-f73c-4521-87b8-211b77833d96\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9t6nh" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.858916 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: E0929 09:50:13.859365 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-29 09:50:14.359349731 +0000 UTC m=+144.564518052 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.861177 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-lkthp" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.873511 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-54mnr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.879248 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c7pz\" (UniqueName: \"kubernetes.io/projected/d8e4b574-6c94-42fe-9e04-651a2f252e8e-kube-api-access-5c7pz\") pod \"catalog-operator-68c6474976-9nk5w\" (UID: \"d8e4b574-6c94-42fe-9e04-651a2f252e8e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9nk5w" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.889954 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g6rvx" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.897086 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfskw\" (UniqueName: \"kubernetes.io/projected/863b27cf-013e-4f40-8b0f-3323ee3aa8ad-kube-api-access-qfskw\") pod \"package-server-manager-789f6589d5-8pm7n\" (UID: \"863b27cf-013e-4f40-8b0f-3323ee3aa8ad\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm7n" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.897977 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g5tqf" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.905495 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.939680 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4k9wr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.952116 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lblz4\" (UniqueName: \"kubernetes.io/projected/5fa727e5-e5ff-49bc-925c-eb272a95a228-kube-api-access-lblz4\") pod \"service-ca-operator-777779d784-khp2g\" (UID: \"5fa727e5-e5ff-49bc-925c-eb272a95a228\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-khp2g" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.960165 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:13 crc kubenswrapper[4891]: E0929 09:50:13.960236 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:14.460208988 +0000 UTC m=+144.665377309 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.961581 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:13 crc kubenswrapper[4891]: E0929 09:50:13.962044 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:14.462034251 +0000 UTC m=+144.667202572 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.962547 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc9xc\" (UniqueName: \"kubernetes.io/projected/71c66a9c-1955-4f9e-8b23-7bd7361b8550-kube-api-access-fc9xc\") pod \"dns-default-mzzff\" (UID: \"71c66a9c-1955-4f9e-8b23-7bd7361b8550\") " pod="openshift-dns/dns-default-mzzff" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.964213 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzz4k\" (UniqueName: \"kubernetes.io/projected/beebb3f2-0934-4828-afec-cacda471836b-kube-api-access-tzz4k\") pod \"olm-operator-6b444d44fb-pfbzg\" (UID: \"beebb3f2-0934-4828-afec-cacda471836b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfbzg" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.979100 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfbzg" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.987443 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-9t6nh" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.987465 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s8ms\" (UniqueName: \"kubernetes.io/projected/492762a9-9e7d-4095-b2f4-990f58b82d21-kube-api-access-5s8ms\") pod \"marketplace-operator-79b997595-96vcr\" (UID: \"492762a9-9e7d-4095-b2f4-990f58b82d21\") " pod="openshift-marketplace/marketplace-operator-79b997595-96vcr" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.993524 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbdpj\" (UniqueName: \"kubernetes.io/projected/443c2a5c-8366-4170-80e7-063687c1caaf-kube-api-access-gbdpj\") pod \"collect-profiles-29318985-cvhgh\" (UID: \"443c2a5c-8366-4170-80e7-063687c1caaf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-cvhgh" Sep 29 09:50:13 crc kubenswrapper[4891]: I0929 09:50:13.995554 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9nk5w" Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.006753 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-96vcr" Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.014174 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6lfmx" Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.022434 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-khp2g" Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.029940 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-cvhgh" Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.038510 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm7n" Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.046133 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mzzff" Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.054363 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-jvjwz" Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.078657 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:14 crc kubenswrapper[4891]: E0929 09:50:14.078850 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:14.578781991 +0000 UTC m=+144.783950322 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.079121 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:14 crc kubenswrapper[4891]: E0929 09:50:14.079488 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:14.579469881 +0000 UTC m=+144.784638202 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.080737 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.182901 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:14 crc kubenswrapper[4891]: E0929 09:50:14.185069 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:14.685032275 +0000 UTC m=+144.890200586 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.187830 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:14 crc kubenswrapper[4891]: E0929 09:50:14.190478 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-09-29 09:50:14.690463933 +0000 UTC m=+144.895632254 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.222713 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jnfml"] Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.239135 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9c55k"] Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.242580 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5t75"] Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.247082 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cd8dx"] Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.262874 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jf4xk"] Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.267301 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkk7z" event={"ID":"2dadc724-8c40-4857-ab4a-bd4dbb58963b","Type":"ContainerStarted","Data":"2ca6eb28fe951ba2a73908e8edfce283863b2b07b2c1d53e7f2a59b5fb91bb98"} Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.267390 4891 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkk7z" event={"ID":"2dadc724-8c40-4857-ab4a-bd4dbb58963b","Type":"ContainerStarted","Data":"e7527c5871444a0a65fd61573e39e1d346f899532745ed3c639d344e9e9a1059"} Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.289334 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:14 crc kubenswrapper[4891]: E0929 09:50:14.289532 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:14.789497177 +0000 UTC m=+144.994665498 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.292408 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:14 crc kubenswrapper[4891]: E0929 09:50:14.292976 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:14.792967218 +0000 UTC m=+144.998135539 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.299808 4891 generic.go:334] "Generic (PLEG): container finished" podID="0971f343-a162-4db1-96bb-3857bd667ad2" containerID="f66b89aaa71951fdf138de0fb03e76cbce2ccd2d6cd37dbaab63057b3838a356" exitCode=0 Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.300099 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cc2bw" event={"ID":"0971f343-a162-4db1-96bb-3857bd667ad2","Type":"ContainerDied","Data":"f66b89aaa71951fdf138de0fb03e76cbce2ccd2d6cd37dbaab63057b3838a356"} Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.300222 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cc2bw" event={"ID":"0971f343-a162-4db1-96bb-3857bd667ad2","Type":"ContainerStarted","Data":"c2fe74d678defc9a3d3209c1c46fbc18d690cae4cc72821976c6785458104487"} Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.304780 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4nmlj"] Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.311669 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tf4wv" event={"ID":"96fa581a-e2fa-4c16-a646-4ac94eec1ef0","Type":"ContainerStarted","Data":"65106fe68b6a4554391c73f14baf2873b7540b9719893339d855a5f88f0be695"} Sep 29 09:50:14 crc 
kubenswrapper[4891]: I0929 09:50:14.324848 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tf4wv" event={"ID":"96fa581a-e2fa-4c16-a646-4ac94eec1ef0","Type":"ContainerStarted","Data":"36cd491be2de74f14df372292ff72f35cf0ee9f191674650e9fdee5521e2c14a"} Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.330735 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4hjlz" event={"ID":"fa68a099-1736-4f9a-bcaf-9840257afaeb","Type":"ContainerStarted","Data":"b61250a424e7d5a83103d2d6561146b7ef911ab084d6ba3ba78e39fc6c34517b"} Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.330814 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4hjlz" event={"ID":"fa68a099-1736-4f9a-bcaf-9840257afaeb","Type":"ContainerStarted","Data":"c896f46fccca7ee7a70c754392e56c9d7652d8917f9ec60ec68e82a745bed200"} Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.338211 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" event={"ID":"482b69f0-36a6-4320-8ea5-9e1263400532","Type":"ContainerStarted","Data":"1ab7a15081adac0dff0dd2fcd4fc62257e79d9a3a7589fb96285af9394bb2dee"} Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.338304 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" event={"ID":"482b69f0-36a6-4320-8ea5-9e1263400532","Type":"ContainerStarted","Data":"b0c9357b891901d8cd69d68d211e2d8362e036de8011ed14abb4cf5d85fdd15d"} Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.340001 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.344990 4891 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pvm8n" event={"ID":"7c8c8f0a-f35e-497d-b953-8b8353d2780e","Type":"ContainerStarted","Data":"8421a5c88b8b75c96cd7873e502659fac62fa8fcf4c63a51bbab1bf10d0eb3dd"} Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.345057 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pvm8n" event={"ID":"7c8c8f0a-f35e-497d-b953-8b8353d2780e","Type":"ContainerStarted","Data":"14acc66d1bd01b6988ca38e227dc910cd1e5b4158ee4c97c23aca7b8c7cfd839"} Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.346146 4891 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7tk8t container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.346193 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" podUID="482b69f0-36a6-4320-8ea5-9e1263400532" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.361216 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bspfg" event={"ID":"570d72c8-d4ed-4b0a-876a-5a942b32a958","Type":"ContainerStarted","Data":"14d4f8e02d4e6b945952ce6b4d496d5bd2cfe9c3ff6279be1938d8ce1ed2b3e6"} Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.363188 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bspfg" 
event={"ID":"570d72c8-d4ed-4b0a-876a-5a942b32a958","Type":"ContainerStarted","Data":"fb6d4a36972b02acd0d3c6c2bf1ba7627b9cabf313775acc538359e60bb00baf"} Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.369510 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-28nrn" event={"ID":"e320dc35-e65d-489f-b752-da6f9eda884f","Type":"ContainerStarted","Data":"01a7cb3ef1e86986e503f338595f3f26b73cb5c244c0f5bcff98fd6a7eb80ade"} Sep 29 09:50:14 crc kubenswrapper[4891]: W0929 09:50:14.369834 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2ca8f6f_22af_4326_88a8_e29b3dc62384.slice/crio-f35a5460b49d5ac6ef9fce4a30045cf65b00772d0acacb38a52ff8d26ca2cbf8 WatchSource:0}: Error finding container f35a5460b49d5ac6ef9fce4a30045cf65b00772d0acacb38a52ff8d26ca2cbf8: Status 404 returned error can't find the container with id f35a5460b49d5ac6ef9fce4a30045cf65b00772d0acacb38a52ff8d26ca2cbf8 Sep 29 09:50:14 crc kubenswrapper[4891]: W0929 09:50:14.371517 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d0c31e3_682d_4758_927e_1e293d39630c.slice/crio-f5ab3c2a1cdc57a4ce11efbee0782e9a796f0c8b3c1e545f56e28aaab9668f82 WatchSource:0}: Error finding container f5ab3c2a1cdc57a4ce11efbee0782e9a796f0c8b3c1e545f56e28aaab9668f82: Status 404 returned error can't find the container with id f5ab3c2a1cdc57a4ce11efbee0782e9a796f0c8b3c1e545f56e28aaab9668f82 Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.388039 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-l9jdp" event={"ID":"303b949b-a531-46fa-a69d-6cc909009fc4","Type":"ContainerStarted","Data":"5ead6a995b2dcb442213f17b46a2cc9e2d2dafcd4868d44c46811eb6aa7ea5f7"} Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.388689 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-console/downloads-7954f5f757-l9jdp" Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.393446 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:14 crc kubenswrapper[4891]: E0929 09:50:14.416731 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:14.916693201 +0000 UTC m=+145.121861522 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.417019 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:14 crc kubenswrapper[4891]: E0929 09:50:14.421738 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2025-09-29 09:50:14.917775763 +0000 UTC m=+145.122944084 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.428955 4891 patch_prober.go:28] interesting pod/downloads-7954f5f757-l9jdp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.429059 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-l9jdp" podUID="303b949b-a531-46fa-a69d-6cc909009fc4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.445303 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4nznm" event={"ID":"4fe5a5b3-033b-4d7e-8829-65de16f908a2","Type":"ContainerStarted","Data":"7fb5aa24ce710d594b6481c96d1336f97320bdd5a00e6cd418fef47bf3f1fedc"} Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.445366 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4nznm" event={"ID":"4fe5a5b3-033b-4d7e-8829-65de16f908a2","Type":"ContainerStarted","Data":"5dbb149b0ced16f2e1d3a5c0944fd81f307ef626f577dab6ab9adf0e78d91746"} Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.458654 4891 generic.go:334] 
"Generic (PLEG): container finished" podID="e26e0265-569e-4929-8dd8-8b2665b37f81" containerID="931d3a6ac6cd67e39ebcacaac0a06a4f56c2db1ddfecd5dd73aeb7a4ee105e53" exitCode=0 Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.458914 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" event={"ID":"e26e0265-569e-4929-8dd8-8b2665b37f81","Type":"ContainerDied","Data":"931d3a6ac6cd67e39ebcacaac0a06a4f56c2db1ddfecd5dd73aeb7a4ee105e53"} Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.458958 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" event={"ID":"e26e0265-569e-4929-8dd8-8b2665b37f81","Type":"ContainerStarted","Data":"a0b1700f575f5e1099aae92f360c3c8babb8e3a18d40a1a22f2d7a75c6f20d2f"} Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.464919 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dm6pg" event={"ID":"f1792c60-bbca-441c-9c02-662c476c2d74","Type":"ContainerStarted","Data":"2931ebcc1b113f96f5dbc43a8ec7a7523471c15b78742378ea7baceb1aa2315b"} Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.470137 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-l9jdp" podStartSLOduration=123.470100866 podStartE2EDuration="2m3.470100866s" podCreationTimestamp="2025-09-29 09:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:14.448837097 +0000 UTC m=+144.654005448" watchObservedRunningTime="2025-09-29 09:50:14.470100866 +0000 UTC m=+144.675269187" Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.486804 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-h4vv2"] Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.503068 
4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mhz5r" event={"ID":"3ce4282f-8661-4ec8-894e-ec921b4c6f4c","Type":"ContainerStarted","Data":"930b876a4993e2abc11e5c9da50a52df530d133c82e979b73f5ef04455eb7279"} Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.503126 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mhz5r" event={"ID":"3ce4282f-8661-4ec8-894e-ec921b4c6f4c","Type":"ContainerStarted","Data":"0708cddc5f17987e8e1b986b6f0322132969e67c78aa6d7aa1dff1d29ac07e60"} Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.527141 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:14 crc kubenswrapper[4891]: E0929 09:50:14.528962 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:15.028928969 +0000 UTC m=+145.234097290 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.542558 4891 generic.go:334] "Generic (PLEG): container finished" podID="76751bcd-3e42-47b3-bfa8-a89525f681f6" containerID="22fe138d33388f59dc9670d9981d2b9ebdb97dc10de41dda7b2345fd708292bf" exitCode=0 Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.542653 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" event={"ID":"76751bcd-3e42-47b3-bfa8-a89525f681f6","Type":"ContainerDied","Data":"22fe138d33388f59dc9670d9981d2b9ebdb97dc10de41dda7b2345fd708292bf"} Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.554086 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-pkk4x" event={"ID":"ca6a0da2-cbc9-45bc-a213-3a2e87fbdbeb","Type":"ContainerStarted","Data":"a262cd06bb1696da5a666e1006d5a73b28d7ff48c5882a83584eac5aeacd4bc7"} Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.632041 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tf4wv" podStartSLOduration=122.631998891 podStartE2EDuration="2m2.631998891s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:14.603078809 +0000 UTC m=+144.808247140" watchObservedRunningTime="2025-09-29 09:50:14.631998891 +0000 UTC 
m=+144.837167222" Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.658198 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:14 crc kubenswrapper[4891]: E0929 09:50:14.659611 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:15.159578194 +0000 UTC m=+145.364746515 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.670973 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-n6x5w"] Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.690665 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g5tqf"] Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.699310 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfbzg"] Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.722322 4891 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-llvr5"] Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.723027 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" podStartSLOduration=122.723016981 podStartE2EDuration="2m2.723016981s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:14.679427732 +0000 UTC m=+144.884596063" watchObservedRunningTime="2025-09-29 09:50:14.723016981 +0000 UTC m=+144.928185302" Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.733460 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bspfg" podStartSLOduration=122.733418064 podStartE2EDuration="2m2.733418064s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:14.708824288 +0000 UTC m=+144.913992609" watchObservedRunningTime="2025-09-29 09:50:14.733418064 +0000 UTC m=+144.938586405" Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.736855 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lkthp"] Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.739230 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-54mnr"] Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.765151 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:14 crc kubenswrapper[4891]: E0929 09:50:14.765368 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:15.265331573 +0000 UTC m=+145.470499894 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.765762 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:14 crc kubenswrapper[4891]: E0929 09:50:14.766229 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:15.266211729 +0000 UTC m=+145.471380060 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.845932 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-28nrn" Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.867090 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:14 crc kubenswrapper[4891]: E0929 09:50:14.867278 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:15.367245081 +0000 UTC m=+145.572413392 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.869296 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:14 crc kubenswrapper[4891]: E0929 09:50:14.869664 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:15.369651971 +0000 UTC m=+145.574820292 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.911549 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pvm8n" podStartSLOduration=123.91153169 podStartE2EDuration="2m3.91153169s" podCreationTimestamp="2025-09-29 09:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:14.869298251 +0000 UTC m=+145.074466592" watchObservedRunningTime="2025-09-29 09:50:14.91153169 +0000 UTC m=+145.116700011" Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.943217 4891 patch_prober.go:28] interesting pod/router-default-5444994796-28nrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 09:50:14 crc kubenswrapper[4891]: [-]has-synced failed: reason withheld Sep 29 09:50:14 crc kubenswrapper[4891]: [+]process-running ok Sep 29 09:50:14 crc kubenswrapper[4891]: healthz check failed Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.943837 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28nrn" podUID="e320dc35-e65d-489f-b752-da6f9eda884f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 09:50:14 crc kubenswrapper[4891]: I0929 09:50:14.970377 4891 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:14 crc kubenswrapper[4891]: E0929 09:50:14.971040 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:15.471015902 +0000 UTC m=+145.676184213 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.055907 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6lfmx"] Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.072120 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:15 crc kubenswrapper[4891]: E0929 09:50:15.072533 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-09-29 09:50:15.572516808 +0000 UTC m=+145.777685129 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.173595 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:15 crc kubenswrapper[4891]: E0929 09:50:15.174230 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:15.674201249 +0000 UTC m=+145.879369570 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.276470 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:15 crc kubenswrapper[4891]: E0929 09:50:15.279559 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:15.779541956 +0000 UTC m=+145.984710277 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.317888 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29318985-cvhgh"] Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.343915 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-4hjlz" podStartSLOduration=123.34388672 podStartE2EDuration="2m3.34388672s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:15.323416714 +0000 UTC m=+145.528585035" watchObservedRunningTime="2025-09-29 09:50:15.34388672 +0000 UTC m=+145.549055061" Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.349824 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg"] Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.358633 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9t6nh"] Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.377999 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:15 crc kubenswrapper[4891]: E0929 09:50:15.382167 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:15.878772616 +0000 UTC m=+146.083940937 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.382909 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g6rvx"] Sep 29 09:50:15 crc kubenswrapper[4891]: W0929 09:50:15.455096 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17df4c88_f73c_4521_87b8_211b77833d96.slice/crio-b5e6817ed7092d67a4eb0fe52f915a7c0f6a316e1047f50f2304f1f793959699 WatchSource:0}: Error finding container b5e6817ed7092d67a4eb0fe52f915a7c0f6a316e1047f50f2304f1f793959699: Status 404 returned error can't find the container with id b5e6817ed7092d67a4eb0fe52f915a7c0f6a316e1047f50f2304f1f793959699 Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.480068 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: 
\"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:15 crc kubenswrapper[4891]: E0929 09:50:15.480655 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:15.980638172 +0000 UTC m=+146.185806493 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.528734 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-khp2g"] Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.539183 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm7n"] Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.548838 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9nwpf"] Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.569504 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-96vcr"] Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.580877 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg" 
event={"ID":"d102c7ee-0242-4b77-85f4-5ca86e742bf2","Type":"ContainerStarted","Data":"b11b89e665d67d37bcf8a7d70b3df726f7fad14fb21e472f639ad34a31554d7f"} Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.581954 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:15 crc kubenswrapper[4891]: E0929 09:50:15.582066 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:16.082039765 +0000 UTC m=+146.287208086 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.584639 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:15 crc kubenswrapper[4891]: E0929 09:50:15.585356 4891 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:16.085339381 +0000 UTC m=+146.290507892 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.592998 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jf4xk" event={"ID":"9231f127-f250-40fa-b818-7b01584701d2","Type":"ContainerStarted","Data":"4190ffa3ebad1177546c8ca2d5e0670b2690c97353d23962b27eca6e5dc0a87e"} Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.593428 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jf4xk" event={"ID":"9231f127-f250-40fa-b818-7b01584701d2","Type":"ContainerStarted","Data":"fcf61d561592f89fb80686708b4a579f2876991c589a409ef83ffe004e1206cd"} Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.600674 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-jnfml" event={"ID":"1afb1091-dfff-4abe-a642-83e586b66b4e","Type":"ContainerStarted","Data":"d5b80730a3da3771fb2bc7156584409edc6fd7cac85de5cea7b6728f273f75be"} Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.600716 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-jnfml" 
event={"ID":"1afb1091-dfff-4abe-a642-83e586b66b4e","Type":"ContainerStarted","Data":"772404d0f8a78aa75a0803edf17be7d2030036f4092c8df1906cf1cd4df68ee7"} Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.602576 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-jnfml" Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.607924 4891 patch_prober.go:28] interesting pod/console-operator-58897d9998-jnfml container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/readyz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.607985 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-jnfml" podUID="1afb1091-dfff-4abe-a642-83e586b66b4e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/readyz\": dial tcp 10.217.0.32:8443: connect: connection refused" Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.611529 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6lfmx" event={"ID":"a8d02cdd-0b15-456d-ad23-5ff1b81875a6","Type":"ContainerStarted","Data":"96322d929eb3b9b1f89953148af9bca99862bbeb7d443bb4650598112033e322"} Sep 29 09:50:15 crc kubenswrapper[4891]: W0929 09:50:15.627221 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod863b27cf_013e_4f40_8b0f_3323ee3aa8ad.slice/crio-27440e0d4294ff2cb9ff6eb54341047de20a90dea7219355079ac94d79ab93c1 WatchSource:0}: Error finding container 27440e0d4294ff2cb9ff6eb54341047de20a90dea7219355079ac94d79ab93c1: Status 404 returned error can't find the container with id 27440e0d4294ff2cb9ff6eb54341047de20a90dea7219355079ac94d79ab93c1 Sep 29 09:50:15 crc kubenswrapper[4891]: 
I0929 09:50:15.632377 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9nk5w"] Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.633512 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-cvhgh" event={"ID":"443c2a5c-8366-4170-80e7-063687c1caaf","Type":"ContainerStarted","Data":"53e88b0c1ab6bfe91e8a6601b18159904f16e572e1156a1e4a7738ce814db722"} Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.636492 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mzzff"] Sep 29 09:50:15 crc kubenswrapper[4891]: W0929 09:50:15.645447 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod492762a9_9e7d_4095_b2f4_990f58b82d21.slice/crio-09f8e29e1c8bcd1d38ebd19ffcbfdb62e33b53c6edb3854fa30c2ae389511161 WatchSource:0}: Error finding container 09f8e29e1c8bcd1d38ebd19ffcbfdb62e33b53c6edb3854fa30c2ae389511161: Status 404 returned error can't find the container with id 09f8e29e1c8bcd1d38ebd19ffcbfdb62e33b53c6edb3854fa30c2ae389511161 Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.645915 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9c55k" event={"ID":"2d0c31e3-682d-4758-927e-1e293d39630c","Type":"ContainerStarted","Data":"6630f9b6f8f9dc4ecbbbbcd087b35bc13c34b28548839ba808fee601ff849f18"} Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.645967 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9c55k" event={"ID":"2d0c31e3-682d-4758-927e-1e293d39630c","Type":"ContainerStarted","Data":"f5ab3c2a1cdc57a4ce11efbee0782e9a796f0c8b3c1e545f56e28aaab9668f82"} Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.651485 4891 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-jvjwz" event={"ID":"00c254c1-9578-492b-8e89-2272cd24b565","Type":"ContainerStarted","Data":"902dafa720b118daa7b6433ac0cfad92a89715057499effcafe71c077f26b7cc"} Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.651531 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-jvjwz" event={"ID":"00c254c1-9578-492b-8e89-2272cd24b565","Type":"ContainerStarted","Data":"a57c4259587d3dfc7de29ea90d62c59165019adb8755464ce0ab972081e65b3e"} Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.654862 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4k9wr"] Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.685899 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:15 crc kubenswrapper[4891]: E0929 09:50:15.686056 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:16.186030583 +0000 UTC m=+146.391198894 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.686430 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:15 crc kubenswrapper[4891]: E0929 09:50:15.687413 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:16.187402293 +0000 UTC m=+146.392570614 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.713510 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cc2bw" event={"ID":"0971f343-a162-4db1-96bb-3857bd667ad2","Type":"ContainerStarted","Data":"c0db254f948e3b55bcecfda209278136cde5b36235929c3ddc8423ed771cef91"} Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.714214 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cc2bw" Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.748005 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-28nrn" podStartSLOduration=123.747975537 podStartE2EDuration="2m3.747975537s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:15.745428083 +0000 UTC m=+145.950596404" watchObservedRunningTime="2025-09-29 09:50:15.747975537 +0000 UTC m=+145.953143868" Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.748211 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-lkthp" event={"ID":"c622ed91-dbc3-4b63-bb42-a9fed92bf9ef","Type":"ContainerStarted","Data":"4aad3e7c11b3ef8bbfe89e24513c4e357126d610424d811a0c880af4858af13a"} Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.788473 4891 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.789919 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dm6pg" event={"ID":"f1792c60-bbca-441c-9c02-662c476c2d74","Type":"ContainerStarted","Data":"20f1dcd558849a821a2d3ad4aacbaff9ca8acde6057d322fc0f8e5b2e39c4e45"} Sep 29 09:50:15 crc kubenswrapper[4891]: E0929 09:50:15.790206 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:16.289023062 +0000 UTC m=+146.494191383 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.790420 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:15 crc kubenswrapper[4891]: E0929 09:50:15.791208 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:16.291186095 +0000 UTC m=+146.496354416 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.837851 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-h4vv2" event={"ID":"b5dbaded-b4ab-4c6d-b825-885f87113f5b","Type":"ContainerStarted","Data":"409c7ac36d4fe909ca34dbcb0d6684885cb709f924a764e19b4163aa4565691e"} Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.859480 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" event={"ID":"44c6fbd1-b182-4c7a-867a-6f310ed97fdb","Type":"ContainerStarted","Data":"780eaab54c9c59f4472129b4cdc11096c335b7614ebbe52b59c0c2b248b9d701"} Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.873713 4891 patch_prober.go:28] interesting pod/router-default-5444994796-28nrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 09:50:15 crc kubenswrapper[4891]: [-]has-synced failed: reason withheld Sep 29 09:50:15 crc kubenswrapper[4891]: [+]process-running ok Sep 29 09:50:15 crc kubenswrapper[4891]: healthz check failed Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.873835 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28nrn" podUID="e320dc35-e65d-489f-b752-da6f9eda884f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.884911 
4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g5tqf" event={"ID":"0e702ccf-0d4b-4ad0-bb27-16bc39303992","Type":"ContainerStarted","Data":"c3825cf0af9afb21afda6e76f6ef76268dcd6e4da78ababd327283cfd324067d"} Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.892629 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:15 crc kubenswrapper[4891]: E0929 09:50:15.924766 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:16.424732564 +0000 UTC m=+146.629900885 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:15 crc kubenswrapper[4891]: W0929 09:50:15.925870 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8e4b574_6c94_42fe_9e04_651a2f252e8e.slice/crio-1a55672fc86654f4c8c71df92b59fcc750122f51dbeee9a76758cc80f6281915 WatchSource:0}: Error finding container 1a55672fc86654f4c8c71df92b59fcc750122f51dbeee9a76758cc80f6281915: Status 404 returned error can't find the container with id 1a55672fc86654f4c8c71df92b59fcc750122f51dbeee9a76758cc80f6281915 Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.940057 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4nmlj" event={"ID":"2017d3b8-cade-41e1-91ce-962c506c4c31","Type":"ContainerStarted","Data":"e57144ceab3f1dd512b7015cee0b1d6c554b30663b8d648ae1d738178af521a6"} Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.940146 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4nmlj" event={"ID":"2017d3b8-cade-41e1-91ce-962c506c4c31","Type":"ContainerStarted","Data":"ab2ac1a0724cb9c6dedba04628905eb3c5b5778710b80140c0104a0dc29e51cd"} Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.942700 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4nmlj" Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.952727 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" event={"ID":"017fcc94-98f6-4abd-9954-c3212676f6e7","Type":"ContainerStarted","Data":"bc0d6ee619c963baa697b2390041e160deea3a35411e5a592cd07cb3bc397311"} Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.952828 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" event={"ID":"017fcc94-98f6-4abd-9954-c3212676f6e7","Type":"ContainerStarted","Data":"560872879b41c041248bbe68368ebaf67ba7ad25b27bf24df1be5e1367d888f5"} Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.954206 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.958332 4891 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4nmlj container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" start-of-body= Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.958417 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4nmlj" podUID="2017d3b8-cade-41e1-91ce-962c506c4c31" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.965713 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-4nznm" podStartSLOduration=124.951766442 podStartE2EDuration="2m4.951766442s" podCreationTimestamp="2025-09-29 09:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:15.873311137 +0000 UTC m=+146.078479458" 
watchObservedRunningTime="2025-09-29 09:50:15.951766442 +0000 UTC m=+146.156934763" Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.971467 4891 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-cd8dx container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.30:6443/healthz\": dial tcp 10.217.0.30:6443: connect: connection refused" start-of-body= Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.971553 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" podUID="017fcc94-98f6-4abd-9954-c3212676f6e7" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.30:6443/healthz\": dial tcp 10.217.0.30:6443: connect: connection refused" Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.987770 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n6x5w" event={"ID":"a36bae20-4dc0-4377-ad84-9cb27c982c88","Type":"ContainerStarted","Data":"d5c681858a7eaa1ef83661e4ec7647020a4e38516459d51304ee082b8d7eb1de"} Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.987892 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n6x5w" event={"ID":"a36bae20-4dc0-4377-ad84-9cb27c982c88","Type":"ContainerStarted","Data":"32077a8d08aaa46a93a0ba102a347f3f93551fb9c25faaa14ba4f64534673197"} Sep 29 09:50:15 crc kubenswrapper[4891]: I0929 09:50:15.989481 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-jvjwz" podStartSLOduration=5.989437559 podStartE2EDuration="5.989437559s" podCreationTimestamp="2025-09-29 09:50:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 
09:50:15.934668074 +0000 UTC m=+146.139836415" watchObservedRunningTime="2025-09-29 09:50:15.989437559 +0000 UTC m=+146.194605900" Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.007755 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:16 crc kubenswrapper[4891]: E0929 09:50:16.009341 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:16.509303117 +0000 UTC m=+146.714471438 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.014026 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9t6nh" event={"ID":"17df4c88-f73c-4521-87b8-211b77833d96","Type":"ContainerStarted","Data":"b5e6817ed7092d67a4eb0fe52f915a7c0f6a316e1047f50f2304f1f793959699"} Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.036219 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cc2bw" podStartSLOduration=125.03618265 
podStartE2EDuration="2m5.03618265s" podCreationTimestamp="2025-09-29 09:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:16.011942504 +0000 UTC m=+146.217110825" watchObservedRunningTime="2025-09-29 09:50:16.03618265 +0000 UTC m=+146.241350991" Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.066327 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-llvr5" event={"ID":"36a6681d-099f-47d1-91a8-da5c2c510a21","Type":"ContainerStarted","Data":"04f5aded92e6d2fb879418a625ddc919b213e2e8b234c4fdf57d30ef561399e9"} Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.068090 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-pkk4x" podStartSLOduration=124.068057778 podStartE2EDuration="2m4.068057778s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:16.040531836 +0000 UTC m=+146.245700177" watchObservedRunningTime="2025-09-29 09:50:16.068057778 +0000 UTC m=+146.273226109" Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.102493 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dm6pg" podStartSLOduration=124.10246375 podStartE2EDuration="2m4.10246375s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:16.101982806 +0000 UTC m=+146.307151147" watchObservedRunningTime="2025-09-29 09:50:16.10246375 +0000 UTC m=+146.307632081" Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.119045 4891 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:16 crc kubenswrapper[4891]: E0929 09:50:16.120846 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:16.620779833 +0000 UTC m=+146.825948154 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.132748 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5t75" event={"ID":"c2ca8f6f-22af-4326-88a8-e29b3dc62384","Type":"ContainerStarted","Data":"274ab6ecd49b8d2ba706959f5ffd224be834b73be6697b6a750705811180b792"} Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.132840 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5t75" event={"ID":"c2ca8f6f-22af-4326-88a8-e29b3dc62384","Type":"ContainerStarted","Data":"f35a5460b49d5ac6ef9fce4a30045cf65b00772d0acacb38a52ff8d26ca2cbf8"} Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.145598 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-khp2g" event={"ID":"5fa727e5-e5ff-49bc-925c-eb272a95a228","Type":"ContainerStarted","Data":"8ba256a379e2170ae6142d95263044e9e16a953e5429cb090822495ea99cc423"} Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.136513 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9c55k" podStartSLOduration=124.136488491 podStartE2EDuration="2m4.136488491s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:16.134357099 +0000 UTC m=+146.339525440" watchObservedRunningTime="2025-09-29 09:50:16.136488491 +0000 UTC m=+146.341656832" Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.166625 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfbzg" event={"ID":"beebb3f2-0934-4828-afec-cacda471836b","Type":"ContainerStarted","Data":"70a38a9c49f0016f06078569c24bf1d7ea9db8ba1f9e9a7165ebdc718bf88bf6"} Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.167881 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfbzg" Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.180007 4891 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-pfbzg container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.180073 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfbzg" podUID="beebb3f2-0934-4828-afec-cacda471836b" 
containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.199163 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-54mnr" event={"ID":"588d9386-5914-4ca4-bea2-fcb06e0de4b3","Type":"ContainerStarted","Data":"76ba4758dd4eca99b8b88f66ae775ba27ab8bbf57662ec147ba28d395029e3a5"} Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.199210 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-54mnr" event={"ID":"588d9386-5914-4ca4-bea2-fcb06e0de4b3","Type":"ContainerStarted","Data":"675774015eece3ddc7ad3f531b269dc82a38612bf8283b0f3e4e4af8dda01b60"} Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.199423 4891 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7tk8t container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.199479 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" podUID="482b69f0-36a6-4320-8ea5-9e1263400532" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.199881 4891 patch_prober.go:28] interesting pod/downloads-7954f5f757-l9jdp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.199924 4891 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-l9jdp" podUID="303b949b-a531-46fa-a69d-6cc909009fc4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.208125 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-jnfml" podStartSLOduration=125.208097786 podStartE2EDuration="2m5.208097786s" podCreationTimestamp="2025-09-29 09:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:16.207382615 +0000 UTC m=+146.412550936" watchObservedRunningTime="2025-09-29 09:50:16.208097786 +0000 UTC m=+146.413266107" Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.208961 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mhz5r" podStartSLOduration=124.208953051 podStartE2EDuration="2m4.208953051s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:16.168345338 +0000 UTC m=+146.373513669" watchObservedRunningTime="2025-09-29 09:50:16.208953051 +0000 UTC m=+146.414121372" Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.221484 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:16 
crc kubenswrapper[4891]: E0929 09:50:16.224973 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:16.724947777 +0000 UTC m=+146.930116278 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.260047 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" podStartSLOduration=124.260007838 podStartE2EDuration="2m4.260007838s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:16.25837239 +0000 UTC m=+146.463540721" watchObservedRunningTime="2025-09-29 09:50:16.260007838 +0000 UTC m=+146.465176159" Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.288104 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4nmlj" podStartSLOduration=124.288080735 podStartE2EDuration="2m4.288080735s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:16.286829729 +0000 UTC m=+146.491998060" watchObservedRunningTime="2025-09-29 09:50:16.288080735 +0000 UTC m=+146.493249066" Sep 29 
09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.316036 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkk7z" podStartSLOduration=124.316006618 podStartE2EDuration="2m4.316006618s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:16.31434774 +0000 UTC m=+146.519516061" watchObservedRunningTime="2025-09-29 09:50:16.316006618 +0000 UTC m=+146.521174959" Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.323199 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:16 crc kubenswrapper[4891]: E0929 09:50:16.323518 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:16.823488886 +0000 UTC m=+147.028657197 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.325446 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:16 crc kubenswrapper[4891]: E0929 09:50:16.328777 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:16.828757 +0000 UTC m=+147.033925491 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.334259 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfbzg" podStartSLOduration=124.334237839 podStartE2EDuration="2m4.334237839s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:16.334105065 +0000 UTC m=+146.539273386" watchObservedRunningTime="2025-09-29 09:50:16.334237839 +0000 UTC m=+146.539406160" Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.371221 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5t75" podStartSLOduration=124.371177175 podStartE2EDuration="2m4.371177175s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:16.358165326 +0000 UTC m=+146.563333657" watchObservedRunningTime="2025-09-29 09:50:16.371177175 +0000 UTC m=+146.576345506" Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.403378 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-llvr5" podStartSLOduration=124.403354472 podStartE2EDuration="2m4.403354472s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:16.401163058 +0000 UTC m=+146.606331389" watchObservedRunningTime="2025-09-29 09:50:16.403354472 +0000 UTC m=+146.608522793" Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.426654 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:16 crc kubenswrapper[4891]: E0929 09:50:16.427470 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:16.927453144 +0000 UTC m=+147.132621455 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.529498 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:16 crc kubenswrapper[4891]: E0929 09:50:16.530180 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:17.030152634 +0000 UTC m=+147.235321085 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.632412 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:16 crc kubenswrapper[4891]: E0929 09:50:16.633322 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:17.133274297 +0000 UTC m=+147.338442618 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.735671 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:16 crc kubenswrapper[4891]: E0929 09:50:16.736228 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:17.236206725 +0000 UTC m=+147.441375046 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.838933 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:16 crc kubenswrapper[4891]: E0929 09:50:16.839058 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:17.339028809 +0000 UTC m=+147.544197120 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.839281 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:16 crc kubenswrapper[4891]: E0929 09:50:16.839821 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:17.339811742 +0000 UTC m=+147.544980063 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.855041 4891 patch_prober.go:28] interesting pod/router-default-5444994796-28nrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 09:50:16 crc kubenswrapper[4891]: [-]has-synced failed: reason withheld Sep 29 09:50:16 crc kubenswrapper[4891]: [+]process-running ok Sep 29 09:50:16 crc kubenswrapper[4891]: healthz check failed Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.855105 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28nrn" podUID="e320dc35-e65d-489f-b752-da6f9eda884f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.940761 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:16 crc kubenswrapper[4891]: E0929 09:50:16.940926 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-29 09:50:17.440884385 +0000 UTC m=+147.646052706 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:16 crc kubenswrapper[4891]: I0929 09:50:16.947937 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:16 crc kubenswrapper[4891]: E0929 09:50:16.948535 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:17.448517017 +0000 UTC m=+147.653685338 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.050491 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:17 crc kubenswrapper[4891]: E0929 09:50:17.051608 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:17.551584958 +0000 UTC m=+147.756753279 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.153714 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:17 crc kubenswrapper[4891]: E0929 09:50:17.154390 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:17.654349371 +0000 UTC m=+147.859517842 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.211942 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-llvr5" event={"ID":"36a6681d-099f-47d1-91a8-da5c2c510a21","Type":"ContainerStarted","Data":"62d3ecc6d72731dadf13e6ef1d941d947703b9194d42955d680563b38b7cae33"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.215334 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-khp2g" event={"ID":"5fa727e5-e5ff-49bc-925c-eb272a95a228","Type":"ContainerStarted","Data":"309f58b85ec93b7368b6f8f82020ac1483019a7a273bfd854fcbc11bde1cc29b"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.221734 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9t6nh" event={"ID":"17df4c88-f73c-4521-87b8-211b77833d96","Type":"ContainerStarted","Data":"9ff9e1eb0125195486bb2d5d8542d287cd55c24de9633aeba689b553e6e6bf16"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.221806 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9t6nh" event={"ID":"17df4c88-f73c-4521-87b8-211b77833d96","Type":"ContainerStarted","Data":"4ed9c0ca64d4a29e83f83a832d084bae5095d2ebeda6a74b51b64436b69910db"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.223346 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4k9wr" 
event={"ID":"d7389ac7-6063-4742-8f3a-bcf2b0721522","Type":"ContainerStarted","Data":"7b07e08c49dd7d4762c0a0bfed0e3efb795e3f924a542343ce6063fdb4fea74e"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.223376 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4k9wr" event={"ID":"d7389ac7-6063-4742-8f3a-bcf2b0721522","Type":"ContainerStarted","Data":"c6ad321957ee97cd0becb678b6efeb3c1d4353efb545635e7662cefb72cca8ea"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.223386 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4k9wr" event={"ID":"d7389ac7-6063-4742-8f3a-bcf2b0721522","Type":"ContainerStarted","Data":"213bc0a5ead2f114e69fbd81f47140e9ee848e265977632752d083543d88c367"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.228776 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mzzff" event={"ID":"71c66a9c-1955-4f9e-8b23-7bd7361b8550","Type":"ContainerStarted","Data":"e79da714568379ce8002f2a04b3007558e318c332660b426f2881bc29254cce3"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.228859 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mzzff" event={"ID":"71c66a9c-1955-4f9e-8b23-7bd7361b8550","Type":"ContainerStarted","Data":"cd7578b4594dde36340efb35cd6dd7f56b2285108ee28ce15de92f171c2af8a0"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.230520 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9nk5w" event={"ID":"d8e4b574-6c94-42fe-9e04-651a2f252e8e","Type":"ContainerStarted","Data":"a923e6b2f2978fc18ad617305df9329e03fb2d7f3083f019759fb0f9440df889"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.230559 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9nk5w" 
event={"ID":"d8e4b574-6c94-42fe-9e04-651a2f252e8e","Type":"ContainerStarted","Data":"1a55672fc86654f4c8c71df92b59fcc750122f51dbeee9a76758cc80f6281915"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.231756 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9nk5w" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.238550 4891 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-9nk5w container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.238634 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9nk5w" podUID="d8e4b574-6c94-42fe-9e04-651a2f252e8e" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.241517 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-khp2g" podStartSLOduration=125.241489359 podStartE2EDuration="2m5.241489359s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:17.239181651 +0000 UTC m=+147.444349982" watchObservedRunningTime="2025-09-29 09:50:17.241489359 +0000 UTC m=+147.446657700" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.247277 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-cvhgh" 
event={"ID":"443c2a5c-8366-4170-80e7-063687c1caaf","Type":"ContainerStarted","Data":"77cbba8bbf10cd7adc0d37679db0ce0c3b360ceb2e5f3f1efa0147177f7e80f7"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.258524 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:17 crc kubenswrapper[4891]: E0929 09:50:17.258746 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:17.75870042 +0000 UTC m=+147.963868741 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.259011 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:17 crc kubenswrapper[4891]: E0929 09:50:17.259495 4891 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:17.759466382 +0000 UTC m=+147.964634703 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.261180 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9nk5w" podStartSLOduration=125.261162411 podStartE2EDuration="2m5.261162411s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:17.260313207 +0000 UTC m=+147.465481538" watchObservedRunningTime="2025-09-29 09:50:17.261162411 +0000 UTC m=+147.466330732" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.272144 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n6x5w" event={"ID":"a36bae20-4dc0-4377-ad84-9cb27c982c88","Type":"ContainerStarted","Data":"f51b15b4e67ab277b030d88e71b28c7b7826893026277b3d4517b22fd30fb83a"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.317779 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4k9wr" podStartSLOduration=125.317746749 podStartE2EDuration="2m5.317746749s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:17.317140331 +0000 UTC m=+147.522308662" watchObservedRunningTime="2025-09-29 09:50:17.317746749 +0000 UTC m=+147.522915070" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.319876 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm7n" event={"ID":"863b27cf-013e-4f40-8b0f-3323ee3aa8ad","Type":"ContainerStarted","Data":"ad109b27464e498cd6d42cbd66fbef77cff17b5476c3cd5f00b615deeaa631d1"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.319952 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm7n" event={"ID":"863b27cf-013e-4f40-8b0f-3323ee3aa8ad","Type":"ContainerStarted","Data":"ea737d5a82237277b4225f9a476021615081f63048dd0d06e1c5ce19e080fdb7"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.319965 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm7n" event={"ID":"863b27cf-013e-4f40-8b0f-3323ee3aa8ad","Type":"ContainerStarted","Data":"27440e0d4294ff2cb9ff6eb54341047de20a90dea7219355079ac94d79ab93c1"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.320944 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm7n" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.329856 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g5tqf" event={"ID":"0e702ccf-0d4b-4ad0-bb27-16bc39303992","Type":"ContainerStarted","Data":"40c7c73949c8b6102e7df51bca3c7e842b8b32632b3739a860883c57fa457a83"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.361812 4891 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.364119 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g6rvx" event={"ID":"21a1ea5d-e0be-47d8-8ed4-f2e9b040771d","Type":"ContainerStarted","Data":"f2ea41d2768137aa1700ca90e5c62c74ae5bdf6b6ecf9c073e79f89b131ee05a"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.364174 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g6rvx" event={"ID":"21a1ea5d-e0be-47d8-8ed4-f2e9b040771d","Type":"ContainerStarted","Data":"e494214f4734ac30b3e9a502057d32618a4bc5130d67e0b64e5d5ea1c70e631b"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.364185 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g6rvx" event={"ID":"21a1ea5d-e0be-47d8-8ed4-f2e9b040771d","Type":"ContainerStarted","Data":"318957ab60f031980c2ef7aa4d91243545185217f453d149800e3f9d44e68f64"} Sep 29 09:50:17 crc kubenswrapper[4891]: E0929 09:50:17.365304 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:17.865278293 +0000 UTC m=+148.070446774 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.368437 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n6x5w" podStartSLOduration=125.368408254 podStartE2EDuration="2m5.368408254s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:17.365622713 +0000 UTC m=+147.570791044" watchObservedRunningTime="2025-09-29 09:50:17.368408254 +0000 UTC m=+147.573576595" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.390913 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6lfmx" event={"ID":"a8d02cdd-0b15-456d-ad23-5ff1b81875a6","Type":"ContainerStarted","Data":"12dc8e315ed314578575abdb0cf4c5d5a5d3d114306b2f0c48477cc052927c1c"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.415213 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-cvhgh" podStartSLOduration=126.415192527 podStartE2EDuration="2m6.415192527s" podCreationTimestamp="2025-09-29 09:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:17.412454247 +0000 UTC m=+147.617622578" watchObservedRunningTime="2025-09-29 09:50:17.415192527 +0000 UTC m=+147.620360848" Sep 29 09:50:17 
crc kubenswrapper[4891]: I0929 09:50:17.438662 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-h4vv2" event={"ID":"b5dbaded-b4ab-4c6d-b825-885f87113f5b","Type":"ContainerStarted","Data":"54c2cdafe65755513ad3d4d1bd0eb6f6c8bd0dc6bb80eba30c54d00b6b3264c9"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.438720 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-h4vv2" event={"ID":"b5dbaded-b4ab-4c6d-b825-885f87113f5b","Type":"ContainerStarted","Data":"c6d76a4759663aa7300cc209753789d0ee6e09cd3ae003f752adca1736c59ddc"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.467437 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g6rvx" podStartSLOduration=126.467407827 podStartE2EDuration="2m6.467407827s" podCreationTimestamp="2025-09-29 09:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:17.466614724 +0000 UTC m=+147.671783055" watchObservedRunningTime="2025-09-29 09:50:17.467407827 +0000 UTC m=+147.672576148" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.467867 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:17 crc kubenswrapper[4891]: E0929 09:50:17.469830 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-29 09:50:17.969777086 +0000 UTC m=+148.174945417 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.473570 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" event={"ID":"76751bcd-3e42-47b3-bfa8-a89525f681f6","Type":"ContainerStarted","Data":"80342fd3ea2334ddb501ffe84da9bb8e80a43792a05ced9f1e1a059ae1eb7421"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.473637 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" event={"ID":"76751bcd-3e42-47b3-bfa8-a89525f681f6","Type":"ContainerStarted","Data":"525d73e57640adebb396c07611636cfa2621d7b78eccfaa477fabde26522e065"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.479965 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-lkthp" event={"ID":"c622ed91-dbc3-4b63-bb42-a9fed92bf9ef","Type":"ContainerStarted","Data":"1753f7d81ec614fbebcf1ae037d67765181c1f52222238481279abe3ce66747b"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.508293 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-54mnr" event={"ID":"588d9386-5914-4ca4-bea2-fcb06e0de4b3","Type":"ContainerStarted","Data":"9d7ec28a1aaccbf39931e95b286fc3c9bc32b027d97999790f59823a45efe175"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.513910 4891 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" event={"ID":"e26e0265-569e-4929-8dd8-8b2665b37f81","Type":"ContainerStarted","Data":"07bba594bc4dc21a3febb861646396586b28f7de4a26432940c2408a95a28df4"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.522662 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg" event={"ID":"d102c7ee-0242-4b77-85f4-5ca86e742bf2","Type":"ContainerStarted","Data":"d88f71a10573a9d27b4741cd4923c7726c0586c252e629c740fe272d2045cc68"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.523899 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.534597 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm7n" podStartSLOduration=125.534575683 podStartE2EDuration="2m5.534575683s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:17.531756361 +0000 UTC m=+147.736924702" watchObservedRunningTime="2025-09-29 09:50:17.534575683 +0000 UTC m=+147.739744004" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.534939 4891 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-6bzpg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.535027 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg" 
podUID="d102c7ee-0242-4b77-85f4-5ca86e742bf2" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.551553 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jf4xk" event={"ID":"9231f127-f250-40fa-b818-7b01584701d2","Type":"ContainerStarted","Data":"41994bc0a444e87e511d2ef22aa580b48ecd37ca364edf97eea2ea6b47235c4c"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.566940 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfbzg" event={"ID":"beebb3f2-0934-4828-afec-cacda471836b","Type":"ContainerStarted","Data":"2f170bb159602085b604eb892a27b40f8348f7ff8144fec7a9b26373a6fc520a"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.567676 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6lfmx" podStartSLOduration=7.5676535860000005 podStartE2EDuration="7.567653586s" podCreationTimestamp="2025-09-29 09:50:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:17.566540374 +0000 UTC m=+147.771708715" watchObservedRunningTime="2025-09-29 09:50:17.567653586 +0000 UTC m=+147.772821907" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.570076 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:17 crc kubenswrapper[4891]: E0929 09:50:17.571536 4891 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:18.071521579 +0000 UTC m=+148.276689900 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.574284 4891 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-pfbzg container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.574362 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfbzg" podUID="beebb3f2-0934-4828-afec-cacda471836b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.579196 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-96vcr" event={"ID":"492762a9-9e7d-4095-b2f4-990f58b82d21","Type":"ContainerStarted","Data":"47e67fe98c4960879191cbdf91b2edc5054b822466a5eb583ee2722d08d0dfeb"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.579242 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-96vcr" event={"ID":"492762a9-9e7d-4095-b2f4-990f58b82d21","Type":"ContainerStarted","Data":"09f8e29e1c8bcd1d38ebd19ffcbfdb62e33b53c6edb3854fa30c2ae389511161"} Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.579260 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-96vcr" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.586070 4891 patch_prober.go:28] interesting pod/console-operator-58897d9998-jnfml container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/readyz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.586149 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-jnfml" podUID="1afb1091-dfff-4abe-a642-83e586b66b4e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/readyz\": dial tcp 10.217.0.32:8443: connect: connection refused" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.587862 4891 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-96vcr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.587893 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-96vcr" podUID="492762a9-9e7d-4095-b2f4-990f58b82d21" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.638302 4891 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.641730 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g5tqf" podStartSLOduration=125.641708223 podStartE2EDuration="2m5.641708223s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:17.632812844 +0000 UTC m=+147.837981175" watchObservedRunningTime="2025-09-29 09:50:17.641708223 +0000 UTC m=+147.846876534" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.643475 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.643845 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.670088 4891 patch_prober.go:28] interesting pod/apiserver-76f77b778f-fs2sv container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.670748 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" podUID="76751bcd-3e42-47b3-bfa8-a89525f681f6" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.671582 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-54mnr" podStartSLOduration=125.671561462 podStartE2EDuration="2m5.671561462s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:17.671196412 +0000 UTC m=+147.876364743" watchObservedRunningTime="2025-09-29 09:50:17.671561462 +0000 UTC m=+147.876729783" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.676589 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:17 crc kubenswrapper[4891]: E0929 09:50:17.709056 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:18.209032353 +0000 UTC m=+148.414200674 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.718601 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg" podStartSLOduration=125.718567691 podStartE2EDuration="2m5.718567691s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:17.711187236 +0000 UTC m=+147.916355567" watchObservedRunningTime="2025-09-29 09:50:17.718567691 +0000 UTC m=+147.923736012" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.779635 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:17 crc kubenswrapper[4891]: E0929 09:50:17.780111 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:18.280088313 +0000 UTC m=+148.485256634 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.858188 4891 patch_prober.go:28] interesting pod/router-default-5444994796-28nrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 09:50:17 crc kubenswrapper[4891]: [-]has-synced failed: reason withheld Sep 29 09:50:17 crc kubenswrapper[4891]: [+]process-running ok Sep 29 09:50:17 crc kubenswrapper[4891]: healthz check failed Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.858281 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28nrn" podUID="e320dc35-e65d-489f-b752-da6f9eda884f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.874577 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.875267 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.881106 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:17 crc kubenswrapper[4891]: E0929 09:50:17.881683 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:18.381666231 +0000 UTC m=+148.586834562 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.901887 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-h4vv2" podStartSLOduration=125.901862049 podStartE2EDuration="2m5.901862049s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:17.897843312 +0000 UTC m=+148.103011643" watchObservedRunningTime="2025-09-29 09:50:17.901862049 +0000 UTC m=+148.107030370" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.923296 4891 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-lzjgr container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.10:8443/livez\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.923385 4891 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" podUID="e26e0265-569e-4929-8dd8-8b2665b37f81" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.10:8443/livez\": dial tcp 10.217.0.10:8443: connect: connection refused" Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.982709 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:17 crc kubenswrapper[4891]: E0929 09:50:17.982956 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:18.482905719 +0000 UTC m=+148.688074040 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:17 crc kubenswrapper[4891]: I0929 09:50:17.983065 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:17 crc kubenswrapper[4891]: E0929 09:50:17.983623 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:18.483607349 +0000 UTC m=+148.688775670 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.083952 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:18 crc kubenswrapper[4891]: E0929 09:50:18.084195 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:18.584162887 +0000 UTC m=+148.789331208 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.084343 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:18 crc kubenswrapper[4891]: E0929 09:50:18.084732 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:18.584715343 +0000 UTC m=+148.789883654 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.112717 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-lkthp" podStartSLOduration=126.112697948 podStartE2EDuration="2m6.112697948s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:18.092340545 +0000 UTC m=+148.297508876" watchObservedRunningTime="2025-09-29 09:50:18.112697948 +0000 UTC m=+148.317866269" Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.185455 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:18 crc kubenswrapper[4891]: E0929 09:50:18.186265 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:18.6859086 +0000 UTC m=+148.891076921 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.199595 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-96vcr" podStartSLOduration=126.199571218 podStartE2EDuration="2m6.199571218s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:18.198156457 +0000 UTC m=+148.403324788" watchObservedRunningTime="2025-09-29 09:50:18.199571218 +0000 UTC m=+148.404739539"
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.288222 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm"
Sep 29 09:50:18 crc kubenswrapper[4891]: E0929 09:50:18.288616 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:18.78859995 +0000 UTC m=+148.993768271 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.389052 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 09:50:18 crc kubenswrapper[4891]: E0929 09:50:18.391651 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:18.891618 +0000 UTC m=+149.096786331 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.424657 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4nmlj"
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.493759 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm"
Sep 29 09:50:18 crc kubenswrapper[4891]: E0929 09:50:18.494285 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:18.994266018 +0000 UTC m=+149.199434349 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.586924 4891 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-cd8dx container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.30:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.587010 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" podUID="017fcc94-98f6-4abd-9954-c3212676f6e7" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.30:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.587257 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" event={"ID":"44c6fbd1-b182-4c7a-867a-6f310ed97fdb","Type":"ContainerStarted","Data":"709878b51a308a18e4d4b34703ddbd497515c360823334837908dd1dca05c9c0"}
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.595059 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 09:50:18 crc kubenswrapper[4891]: E0929 09:50:18.595697 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:19.095668731 +0000 UTC m=+149.300837052 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.597835 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" podStartSLOduration=126.597810864 podStartE2EDuration="2m6.597810864s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:18.466495031 +0000 UTC m=+148.671663372" watchObservedRunningTime="2025-09-29 09:50:18.597810864 +0000 UTC m=+148.802979205"
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.616279 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mzzff" event={"ID":"71c66a9c-1955-4f9e-8b23-7bd7361b8550","Type":"ContainerStarted","Data":"6d8cdbdb5014e4ee3433a0a6d4cb435e9dd231e357431bb70605a07cd2b8f219"}
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.619725 4891 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-9nk5w container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body=
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.619822 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9nk5w" podUID="d8e4b574-6c94-42fe-9e04-651a2f252e8e" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused"
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.637127 4891 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-96vcr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body=
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.640213 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-96vcr" podUID="492762a9-9e7d-4095-b2f4-990f58b82d21" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused"
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.697546 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm"
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.698965 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfbzg"
Sep 29 09:50:18 crc kubenswrapper[4891]: E0929 09:50:18.708494 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:19.208455376 +0000 UTC m=+149.413623697 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.710994 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" podStartSLOduration=126.710973919 podStartE2EDuration="2m6.710973919s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:18.695656133 +0000 UTC m=+148.900824464" watchObservedRunningTime="2025-09-29 09:50:18.710973919 +0000 UTC m=+148.916142260"
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.715114 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jf4xk" podStartSLOduration=126.715095629 podStartE2EDuration="2m6.715095629s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:18.601246354 +0000 UTC m=+148.806414695" watchObservedRunningTime="2025-09-29 09:50:18.715095629 +0000 UTC m=+148.920263950"
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.811127 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 09:50:18 crc kubenswrapper[4891]: E0929 09:50:18.811642 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:19.31162378 +0000 UTC m=+149.516792101 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.863094 4891 patch_prober.go:28] interesting pod/router-default-5444994796-28nrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 29 09:50:18 crc kubenswrapper[4891]: [-]has-synced failed: reason withheld
Sep 29 09:50:18 crc kubenswrapper[4891]: [+]process-running ok
Sep 29 09:50:18 crc kubenswrapper[4891]: healthz check failed
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.863171 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28nrn" podUID="e320dc35-e65d-489f-b752-da6f9eda884f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.914263 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mzzff" podStartSLOduration=8.914215537 podStartE2EDuration="8.914215537s" podCreationTimestamp="2025-09-29 09:50:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:18.865453667 +0000 UTC m=+149.070621998" watchObservedRunningTime="2025-09-29 09:50:18.914215537 +0000 UTC m=+149.119383858"
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.916100 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm"
Sep 29 09:50:18 crc kubenswrapper[4891]: E0929 09:50:18.916763 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:19.416737001 +0000 UTC m=+149.621905322 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:50:18 crc kubenswrapper[4891]: I0929 09:50:18.978761 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-9t6nh" podStartSLOduration=126.97784879 podStartE2EDuration="2m6.97784879s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:18.977213832 +0000 UTC m=+149.182382163" watchObservedRunningTime="2025-09-29 09:50:18.97784879 +0000 UTC m=+149.183017121"
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.017748 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 09:50:19 crc kubenswrapper[4891]: E0929 09:50:19.018408 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:19.518379591 +0000 UTC m=+149.723547912 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.120181 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm"
Sep 29 09:50:19 crc kubenswrapper[4891]: E0929 09:50:19.120829 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:19.620784053 +0000 UTC m=+149.825952544 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.132692 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg"
Sep 29 09:50:19 crc kubenswrapper[4891]: E0929 09:50:19.222321 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:19.722298989 +0000 UTC m=+149.927467310 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.222364 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.222637 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm"
Sep 29 09:50:19 crc kubenswrapper[4891]: E0929 09:50:19.223007 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:19.722998529 +0000 UTC m=+149.928166840 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.324277 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.324664 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.324712 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.324756 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.324813 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 09:50:19 crc kubenswrapper[4891]: E0929 09:50:19.326320 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:19.826293617 +0000 UTC m=+150.031461938 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.330914 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.336332 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.336965 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.360383 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.421365 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.426174 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm"
Sep 29 09:50:19 crc kubenswrapper[4891]: E0929 09:50:19.426531 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:19.926519546 +0000 UTC m=+150.131687857 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.429090 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.435885 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.527084 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 09:50:19 crc kubenswrapper[4891]: E0929 09:50:19.527257 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:20.027227128 +0000 UTC m=+150.232395449 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.527721 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm"
Sep 29 09:50:19 crc kubenswrapper[4891]: E0929 09:50:19.528174 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:20.028161126 +0000 UTC m=+150.233329447 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.624545 4891 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-cc2bw container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.624630 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cc2bw" podUID="0971f343-a162-4db1-96bb-3857bd667ad2" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.628924 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 09:50:19 crc kubenswrapper[4891]: E0929 09:50:19.629113 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:20.129083854 +0000 UTC m=+150.334252175 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.629279 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm"
Sep 29 09:50:19 crc kubenswrapper[4891]: E0929 09:50:19.629868 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:20.129846327 +0000 UTC m=+150.335014658 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.631079 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-mzzff"
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.696270 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9nk5w"
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.730222 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 09:50:19 crc kubenswrapper[4891]: E0929 09:50:19.732565 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:20.232540607 +0000 UTC m=+150.437708928 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.845949 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm"
Sep 29 09:50:19 crc kubenswrapper[4891]: E0929 09:50:19.846479 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:20.346460475 +0000 UTC m=+150.551628796 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.861207 4891 patch_prober.go:28] interesting pod/router-default-5444994796-28nrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 29 09:50:19 crc kubenswrapper[4891]: [-]has-synced failed: reason withheld
Sep 29 09:50:19 crc kubenswrapper[4891]: [+]process-running ok
Sep 29 09:50:19 crc kubenswrapper[4891]: healthz check failed
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.861305 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28nrn" podUID="e320dc35-e65d-489f-b752-da6f9eda884f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.910404 4891 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-cc2bw container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.910512 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cc2bw" podUID="0971f343-a162-4db1-96bb-3857bd667ad2" containerName="openshift-config-operator" probeResult="failure" output="Get 
\"https://10.217.0.12:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 29 09:50:19 crc kubenswrapper[4891]: I0929 09:50:19.948705 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:19 crc kubenswrapper[4891]: E0929 09:50:19.949651 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:20.449628279 +0000 UTC m=+150.654796600 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:20 crc kubenswrapper[4891]: I0929 09:50:20.004378 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cc2bw" Sep 29 09:50:20 crc kubenswrapper[4891]: I0929 09:50:20.052937 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:20 crc kubenswrapper[4891]: E0929 09:50:20.053363 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:20.553349949 +0000 UTC m=+150.758518270 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:20 crc kubenswrapper[4891]: I0929 09:50:20.154639 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:20 crc kubenswrapper[4891]: E0929 09:50:20.155577 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:20.655524404 +0000 UTC m=+150.860692725 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:20 crc kubenswrapper[4891]: I0929 09:50:20.163894 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:20 crc kubenswrapper[4891]: E0929 09:50:20.164362 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:20.664345571 +0000 UTC m=+150.869513892 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:20 crc kubenswrapper[4891]: I0929 09:50:20.265090 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:20 crc kubenswrapper[4891]: E0929 09:50:20.265574 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:20.765547978 +0000 UTC m=+150.970716299 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:20 crc kubenswrapper[4891]: I0929 09:50:20.372522 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:20 crc kubenswrapper[4891]: E0929 09:50:20.373078 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:20.873054229 +0000 UTC m=+151.078222730 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:20 crc kubenswrapper[4891]: I0929 09:50:20.473768 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:20 crc kubenswrapper[4891]: E0929 09:50:20.474083 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:20.974032109 +0000 UTC m=+151.179200450 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:20 crc kubenswrapper[4891]: I0929 09:50:20.474749 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:20 crc kubenswrapper[4891]: E0929 09:50:20.475347 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:20.975328877 +0000 UTC m=+151.180497198 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:20 crc kubenswrapper[4891]: I0929 09:50:20.601234 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:20 crc kubenswrapper[4891]: E0929 09:50:20.601847 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:21.101825151 +0000 UTC m=+151.306993472 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:20 crc kubenswrapper[4891]: I0929 09:50:20.688804 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0be81be0ebf278694176cd689b8e6eb182c2bea19f0f8b1593897ab6cf6857e8"} Sep 29 09:50:20 crc kubenswrapper[4891]: I0929 09:50:20.704693 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:20 crc kubenswrapper[4891]: E0929 09:50:20.705282 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:21.205262243 +0000 UTC m=+151.410430564 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:20 crc kubenswrapper[4891]: I0929 09:50:20.808460 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:20 crc kubenswrapper[4891]: E0929 09:50:20.811767 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:21.311733143 +0000 UTC m=+151.516901464 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:20 crc kubenswrapper[4891]: I0929 09:50:20.865174 4891 patch_prober.go:28] interesting pod/router-default-5444994796-28nrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 09:50:20 crc kubenswrapper[4891]: [-]has-synced failed: reason withheld Sep 29 09:50:20 crc kubenswrapper[4891]: [+]process-running ok Sep 29 09:50:20 crc kubenswrapper[4891]: healthz check failed Sep 29 09:50:20 crc kubenswrapper[4891]: I0929 09:50:20.865761 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28nrn" podUID="e320dc35-e65d-489f-b752-da6f9eda884f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 09:50:20 crc kubenswrapper[4891]: I0929 09:50:20.910780 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:20 crc kubenswrapper[4891]: E0929 09:50:20.911203 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-29 09:50:21.411188849 +0000 UTC m=+151.616357170 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:20 crc kubenswrapper[4891]: I0929 09:50:20.935973 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9n7fc"] Sep 29 09:50:20 crc kubenswrapper[4891]: I0929 09:50:20.937403 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9n7fc" Sep 29 09:50:20 crc kubenswrapper[4891]: I0929 09:50:20.947053 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 29 09:50:20 crc kubenswrapper[4891]: I0929 09:50:20.951579 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9n7fc"] Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.015024 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.015367 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq6jj\" (UniqueName: \"kubernetes.io/projected/b457bc1b-81dc-4e12-a955-7cf02a8a03e6-kube-api-access-gq6jj\") pod 
\"certified-operators-9n7fc\" (UID: \"b457bc1b-81dc-4e12-a955-7cf02a8a03e6\") " pod="openshift-marketplace/certified-operators-9n7fc" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.015404 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b457bc1b-81dc-4e12-a955-7cf02a8a03e6-catalog-content\") pod \"certified-operators-9n7fc\" (UID: \"b457bc1b-81dc-4e12-a955-7cf02a8a03e6\") " pod="openshift-marketplace/certified-operators-9n7fc" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.015492 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b457bc1b-81dc-4e12-a955-7cf02a8a03e6-utilities\") pod \"certified-operators-9n7fc\" (UID: \"b457bc1b-81dc-4e12-a955-7cf02a8a03e6\") " pod="openshift-marketplace/certified-operators-9n7fc" Sep 29 09:50:21 crc kubenswrapper[4891]: E0929 09:50:21.015589 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:21.515576119 +0000 UTC m=+151.720744440 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.116966 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b457bc1b-81dc-4e12-a955-7cf02a8a03e6-utilities\") pod \"certified-operators-9n7fc\" (UID: \"b457bc1b-81dc-4e12-a955-7cf02a8a03e6\") " pod="openshift-marketplace/certified-operators-9n7fc" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.117018 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq6jj\" (UniqueName: \"kubernetes.io/projected/b457bc1b-81dc-4e12-a955-7cf02a8a03e6-kube-api-access-gq6jj\") pod \"certified-operators-9n7fc\" (UID: \"b457bc1b-81dc-4e12-a955-7cf02a8a03e6\") " pod="openshift-marketplace/certified-operators-9n7fc" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.117053 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b457bc1b-81dc-4e12-a955-7cf02a8a03e6-catalog-content\") pod \"certified-operators-9n7fc\" (UID: \"b457bc1b-81dc-4e12-a955-7cf02a8a03e6\") " pod="openshift-marketplace/certified-operators-9n7fc" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.117090 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: 
\"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:21 crc kubenswrapper[4891]: E0929 09:50:21.117449 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:21.617434345 +0000 UTC m=+151.822602666 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.118105 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b457bc1b-81dc-4e12-a955-7cf02a8a03e6-utilities\") pod \"certified-operators-9n7fc\" (UID: \"b457bc1b-81dc-4e12-a955-7cf02a8a03e6\") " pod="openshift-marketplace/certified-operators-9n7fc" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.118443 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b457bc1b-81dc-4e12-a955-7cf02a8a03e6-catalog-content\") pod \"certified-operators-9n7fc\" (UID: \"b457bc1b-81dc-4e12-a955-7cf02a8a03e6\") " pod="openshift-marketplace/certified-operators-9n7fc" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.136093 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p4zqs"] Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.137238 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p4zqs" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.141435 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.195897 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq6jj\" (UniqueName: \"kubernetes.io/projected/b457bc1b-81dc-4e12-a955-7cf02a8a03e6-kube-api-access-gq6jj\") pod \"certified-operators-9n7fc\" (UID: \"b457bc1b-81dc-4e12-a955-7cf02a8a03e6\") " pod="openshift-marketplace/certified-operators-9n7fc" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.226676 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.227029 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/255a721a-d660-405d-a4d4-9f1cfdc6bb76-utilities\") pod \"community-operators-p4zqs\" (UID: \"255a721a-d660-405d-a4d4-9f1cfdc6bb76\") " pod="openshift-marketplace/community-operators-p4zqs" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.227160 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/255a721a-d660-405d-a4d4-9f1cfdc6bb76-catalog-content\") pod \"community-operators-p4zqs\" (UID: \"255a721a-d660-405d-a4d4-9f1cfdc6bb76\") " pod="openshift-marketplace/community-operators-p4zqs" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.227208 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn26q\" (UniqueName: \"kubernetes.io/projected/255a721a-d660-405d-a4d4-9f1cfdc6bb76-kube-api-access-qn26q\") pod \"community-operators-p4zqs\" (UID: \"255a721a-d660-405d-a4d4-9f1cfdc6bb76\") " pod="openshift-marketplace/community-operators-p4zqs" Sep 29 09:50:21 crc kubenswrapper[4891]: E0929 09:50:21.227388 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:21.727356696 +0000 UTC m=+151.932525017 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.241895 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p4zqs"] Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.291182 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9n7fc" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.330240 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/255a721a-d660-405d-a4d4-9f1cfdc6bb76-utilities\") pod \"community-operators-p4zqs\" (UID: \"255a721a-d660-405d-a4d4-9f1cfdc6bb76\") " pod="openshift-marketplace/community-operators-p4zqs" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.330335 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.330358 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/255a721a-d660-405d-a4d4-9f1cfdc6bb76-catalog-content\") pod \"community-operators-p4zqs\" (UID: \"255a721a-d660-405d-a4d4-9f1cfdc6bb76\") " pod="openshift-marketplace/community-operators-p4zqs" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.330384 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn26q\" (UniqueName: \"kubernetes.io/projected/255a721a-d660-405d-a4d4-9f1cfdc6bb76-kube-api-access-qn26q\") pod \"community-operators-p4zqs\" (UID: \"255a721a-d660-405d-a4d4-9f1cfdc6bb76\") " pod="openshift-marketplace/community-operators-p4zqs" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.331525 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/255a721a-d660-405d-a4d4-9f1cfdc6bb76-utilities\") pod \"community-operators-p4zqs\" 
(UID: \"255a721a-d660-405d-a4d4-9f1cfdc6bb76\") " pod="openshift-marketplace/community-operators-p4zqs" Sep 29 09:50:21 crc kubenswrapper[4891]: E0929 09:50:21.331958 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:21.831946372 +0000 UTC m=+152.037114683 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.332326 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/255a721a-d660-405d-a4d4-9f1cfdc6bb76-catalog-content\") pod \"community-operators-p4zqs\" (UID: \"255a721a-d660-405d-a4d4-9f1cfdc6bb76\") " pod="openshift-marketplace/community-operators-p4zqs" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.349322 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vnw4z"] Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.350454 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vnw4z" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.369314 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn26q\" (UniqueName: \"kubernetes.io/projected/255a721a-d660-405d-a4d4-9f1cfdc6bb76-kube-api-access-qn26q\") pod \"community-operators-p4zqs\" (UID: \"255a721a-d660-405d-a4d4-9f1cfdc6bb76\") " pod="openshift-marketplace/community-operators-p4zqs" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.377441 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vnw4z"] Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.430983 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.431359 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gqfd\" (UniqueName: \"kubernetes.io/projected/cf8f9aa8-4899-46e0-bf4f-c614dcd05804-kube-api-access-9gqfd\") pod \"certified-operators-vnw4z\" (UID: \"cf8f9aa8-4899-46e0-bf4f-c614dcd05804\") " pod="openshift-marketplace/certified-operators-vnw4z" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.431435 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf8f9aa8-4899-46e0-bf4f-c614dcd05804-utilities\") pod \"certified-operators-vnw4z\" (UID: \"cf8f9aa8-4899-46e0-bf4f-c614dcd05804\") " pod="openshift-marketplace/certified-operators-vnw4z" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.431479 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf8f9aa8-4899-46e0-bf4f-c614dcd05804-catalog-content\") pod \"certified-operators-vnw4z\" (UID: \"cf8f9aa8-4899-46e0-bf4f-c614dcd05804\") " pod="openshift-marketplace/certified-operators-vnw4z" Sep 29 09:50:21 crc kubenswrapper[4891]: E0929 09:50:21.431578 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:21.931563393 +0000 UTC m=+152.136731704 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.490375 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p4zqs" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.524889 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fqbbz"] Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.549672 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gqfd\" (UniqueName: \"kubernetes.io/projected/cf8f9aa8-4899-46e0-bf4f-c614dcd05804-kube-api-access-9gqfd\") pod \"certified-operators-vnw4z\" (UID: \"cf8f9aa8-4899-46e0-bf4f-c614dcd05804\") " pod="openshift-marketplace/certified-operators-vnw4z" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.549766 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf8f9aa8-4899-46e0-bf4f-c614dcd05804-utilities\") pod \"certified-operators-vnw4z\" (UID: \"cf8f9aa8-4899-46e0-bf4f-c614dcd05804\") " pod="openshift-marketplace/certified-operators-vnw4z" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.549869 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf8f9aa8-4899-46e0-bf4f-c614dcd05804-catalog-content\") pod \"certified-operators-vnw4z\" (UID: \"cf8f9aa8-4899-46e0-bf4f-c614dcd05804\") " pod="openshift-marketplace/certified-operators-vnw4z" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.549903 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.551319 4891 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf8f9aa8-4899-46e0-bf4f-c614dcd05804-catalog-content\") pod \"certified-operators-vnw4z\" (UID: \"cf8f9aa8-4899-46e0-bf4f-c614dcd05804\") " pod="openshift-marketplace/certified-operators-vnw4z" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.551586 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fqbbz" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.556037 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf8f9aa8-4899-46e0-bf4f-c614dcd05804-utilities\") pod \"certified-operators-vnw4z\" (UID: \"cf8f9aa8-4899-46e0-bf4f-c614dcd05804\") " pod="openshift-marketplace/certified-operators-vnw4z" Sep 29 09:50:21 crc kubenswrapper[4891]: E0929 09:50:21.551606 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:22.051590888 +0000 UTC m=+152.256759209 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.557809 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fqbbz"] Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.600828 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gqfd\" (UniqueName: \"kubernetes.io/projected/cf8f9aa8-4899-46e0-bf4f-c614dcd05804-kube-api-access-9gqfd\") pod \"certified-operators-vnw4z\" (UID: \"cf8f9aa8-4899-46e0-bf4f-c614dcd05804\") " pod="openshift-marketplace/certified-operators-vnw4z" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.637138 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vnw4z" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.655489 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.655764 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/564b206f-9094-4670-a3e0-2293b94fe724-utilities\") pod \"community-operators-fqbbz\" (UID: \"564b206f-9094-4670-a3e0-2293b94fe724\") " pod="openshift-marketplace/community-operators-fqbbz" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.655854 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zf6d\" (UniqueName: \"kubernetes.io/projected/564b206f-9094-4670-a3e0-2293b94fe724-kube-api-access-9zf6d\") pod \"community-operators-fqbbz\" (UID: \"564b206f-9094-4670-a3e0-2293b94fe724\") " pod="openshift-marketplace/community-operators-fqbbz" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.655876 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/564b206f-9094-4670-a3e0-2293b94fe724-catalog-content\") pod \"community-operators-fqbbz\" (UID: \"564b206f-9094-4670-a3e0-2293b94fe724\") " pod="openshift-marketplace/community-operators-fqbbz" Sep 29 09:50:21 crc kubenswrapper[4891]: E0929 09:50:21.656004 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-09-29 09:50:22.155989118 +0000 UTC m=+152.361157439 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.698011 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"23617b1695ccabffdc9a991a6b06138b7933814fab929c3cd01b462f60e068ef"} Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.698443 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"16ed15c786fb7ec07e7498aa2af41eb211cb03b69e29a64927a37f6aba5bc13b"} Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.699647 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.703440 4891 generic.go:334] "Generic (PLEG): container finished" podID="443c2a5c-8366-4170-80e7-063687c1caaf" containerID="77cbba8bbf10cd7adc0d37679db0ce0c3b360ceb2e5f3f1efa0147177f7e80f7" exitCode=0 Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.703522 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-cvhgh" 
event={"ID":"443c2a5c-8366-4170-80e7-063687c1caaf","Type":"ContainerDied","Data":"77cbba8bbf10cd7adc0d37679db0ce0c3b360ceb2e5f3f1efa0147177f7e80f7"} Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.706594 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" event={"ID":"44c6fbd1-b182-4c7a-867a-6f310ed97fdb","Type":"ContainerStarted","Data":"b47cbcb7cfaaa26864f89db843ead55d55aca3af86a3a7a173668f2c95218577"} Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.710843 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"829cdabf5e9ab0525a91f7a97da3bc167deaef7a78dcb5a3e815604d613d1eda"} Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.745123 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9390642ef9fc99b16a16c568ff348649d474005288fc6772e48fba112860ea9e"} Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.745167 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6f64b02d8033fec3691ecaf552f31e2313f2699da0de18f3ab5526cb20ed22f0"} Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.771044 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zf6d\" (UniqueName: \"kubernetes.io/projected/564b206f-9094-4670-a3e0-2293b94fe724-kube-api-access-9zf6d\") pod \"community-operators-fqbbz\" (UID: \"564b206f-9094-4670-a3e0-2293b94fe724\") " pod="openshift-marketplace/community-operators-fqbbz" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.771094 4891 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.771123 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/564b206f-9094-4670-a3e0-2293b94fe724-catalog-content\") pod \"community-operators-fqbbz\" (UID: \"564b206f-9094-4670-a3e0-2293b94fe724\") " pod="openshift-marketplace/community-operators-fqbbz" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.771173 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/564b206f-9094-4670-a3e0-2293b94fe724-utilities\") pod \"community-operators-fqbbz\" (UID: \"564b206f-9094-4670-a3e0-2293b94fe724\") " pod="openshift-marketplace/community-operators-fqbbz" Sep 29 09:50:21 crc kubenswrapper[4891]: E0929 09:50:21.772977 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:22.272961214 +0000 UTC m=+152.478129535 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.773564 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/564b206f-9094-4670-a3e0-2293b94fe724-utilities\") pod \"community-operators-fqbbz\" (UID: \"564b206f-9094-4670-a3e0-2293b94fe724\") " pod="openshift-marketplace/community-operators-fqbbz" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.773614 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/564b206f-9094-4670-a3e0-2293b94fe724-catalog-content\") pod \"community-operators-fqbbz\" (UID: \"564b206f-9094-4670-a3e0-2293b94fe724\") " pod="openshift-marketplace/community-operators-fqbbz" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.855957 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zf6d\" (UniqueName: \"kubernetes.io/projected/564b206f-9094-4670-a3e0-2293b94fe724-kube-api-access-9zf6d\") pod \"community-operators-fqbbz\" (UID: \"564b206f-9094-4670-a3e0-2293b94fe724\") " pod="openshift-marketplace/community-operators-fqbbz" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.860407 4891 patch_prober.go:28] interesting pod/router-default-5444994796-28nrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 09:50:21 crc kubenswrapper[4891]: [-]has-synced failed: 
reason withheld Sep 29 09:50:21 crc kubenswrapper[4891]: [+]process-running ok Sep 29 09:50:21 crc kubenswrapper[4891]: healthz check failed Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.860492 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28nrn" podUID="e320dc35-e65d-489f-b752-da6f9eda884f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.877337 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:21 crc kubenswrapper[4891]: E0929 09:50:21.877620 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:22.377590821 +0000 UTC m=+152.582759142 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.877783 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:21 crc kubenswrapper[4891]: E0929 09:50:21.879658 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:22.379641971 +0000 UTC m=+152.584810362 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.949159 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fqbbz" Sep 29 09:50:21 crc kubenswrapper[4891]: I0929 09:50:21.981590 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:21 crc kubenswrapper[4891]: E0929 09:50:21.982136 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:22.482107965 +0000 UTC m=+152.687276296 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.045292 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.046558 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.066644 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.067094 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.088812 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:22 crc kubenswrapper[4891]: E0929 09:50:22.089144 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:22.58912999 +0000 UTC m=+152.794298311 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.105957 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.189592 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:22 crc kubenswrapper[4891]: E0929 09:50:22.189998 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:22.689948996 +0000 UTC m=+152.895117317 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.196168 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.196392 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f87cee97-3326-4334-95e7-da15db1dbf12-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f87cee97-3326-4334-95e7-da15db1dbf12\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.196455 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f87cee97-3326-4334-95e7-da15db1dbf12-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f87cee97-3326-4334-95e7-da15db1dbf12\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 09:50:22 crc kubenswrapper[4891]: E0929 09:50:22.196633 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-29 09:50:22.69661004 +0000 UTC m=+152.901778431 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.221613 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9n7fc"] Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.300938 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.301271 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f87cee97-3326-4334-95e7-da15db1dbf12-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f87cee97-3326-4334-95e7-da15db1dbf12\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.301315 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f87cee97-3326-4334-95e7-da15db1dbf12-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f87cee97-3326-4334-95e7-da15db1dbf12\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.301558 4891 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f87cee97-3326-4334-95e7-da15db1dbf12-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f87cee97-3326-4334-95e7-da15db1dbf12\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 09:50:22 crc kubenswrapper[4891]: E0929 09:50:22.301716 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:22.80169565 +0000 UTC m=+153.006863991 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.345526 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f87cee97-3326-4334-95e7-da15db1dbf12-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f87cee97-3326-4334-95e7-da15db1dbf12\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.347476 4891 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.402888 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:22 crc kubenswrapper[4891]: E0929 09:50:22.403863 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:22.903849745 +0000 UTC m=+153.109018056 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.408142 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.488336 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vnw4z"] Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.507183 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:22 crc kubenswrapper[4891]: E0929 09:50:22.507574 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:50:23.007545334 +0000 UTC m=+153.212713655 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.507865 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.511981 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.516264 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.517770 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.524859 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.589374 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p4zqs"] Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.605885 4891 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-09-29T09:50:22.347501954Z","Handler":null,"Name":""} Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.608489 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.608717 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a498a4fe-29bb-4868-80e7-2576fd472ac8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a498a4fe-29bb-4868-80e7-2576fd472ac8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 
09:50:22.608975 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a498a4fe-29bb-4868-80e7-2576fd472ac8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a498a4fe-29bb-4868-80e7-2576fd472ac8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 09:50:22 crc kubenswrapper[4891]: E0929 09:50:22.609610 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:50:23.109570945 +0000 UTC m=+153.314739266 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mgnzm" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.627379 4891 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.627424 4891 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.668482 4891 patch_prober.go:28] interesting pod/apiserver-76f77b778f-fs2sv container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Sep 29 09:50:22 crc kubenswrapper[4891]: 
[+]log ok Sep 29 09:50:22 crc kubenswrapper[4891]: [+]etcd ok Sep 29 09:50:22 crc kubenswrapper[4891]: [+]poststarthook/start-apiserver-admission-initializer ok Sep 29 09:50:22 crc kubenswrapper[4891]: [+]poststarthook/generic-apiserver-start-informers ok Sep 29 09:50:22 crc kubenswrapper[4891]: [+]poststarthook/max-in-flight-filter ok Sep 29 09:50:22 crc kubenswrapper[4891]: [+]poststarthook/storage-object-count-tracker-hook ok Sep 29 09:50:22 crc kubenswrapper[4891]: [+]poststarthook/image.openshift.io-apiserver-caches ok Sep 29 09:50:22 crc kubenswrapper[4891]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Sep 29 09:50:22 crc kubenswrapper[4891]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Sep 29 09:50:22 crc kubenswrapper[4891]: [+]poststarthook/project.openshift.io-projectcache ok Sep 29 09:50:22 crc kubenswrapper[4891]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Sep 29 09:50:22 crc kubenswrapper[4891]: [+]poststarthook/openshift.io-startinformers ok Sep 29 09:50:22 crc kubenswrapper[4891]: [+]poststarthook/openshift.io-restmapperupdater ok Sep 29 09:50:22 crc kubenswrapper[4891]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Sep 29 09:50:22 crc kubenswrapper[4891]: livez check failed Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.668553 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" podUID="76751bcd-3e42-47b3-bfa8-a89525f681f6" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.710414 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.711054 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a498a4fe-29bb-4868-80e7-2576fd472ac8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a498a4fe-29bb-4868-80e7-2576fd472ac8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.711155 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a498a4fe-29bb-4868-80e7-2576fd472ac8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a498a4fe-29bb-4868-80e7-2576fd472ac8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.711294 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a498a4fe-29bb-4868-80e7-2576fd472ac8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a498a4fe-29bb-4868-80e7-2576fd472ac8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.722097 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.723009 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fqbbz"] Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.739282 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a498a4fe-29bb-4868-80e7-2576fd472ac8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a498a4fe-29bb-4868-80e7-2576fd472ac8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.759333 4891 patch_prober.go:28] interesting pod/downloads-7954f5f757-l9jdp container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.759406 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-l9jdp" podUID="303b949b-a531-46fa-a69d-6cc909009fc4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.759706 4891 patch_prober.go:28] interesting pod/downloads-7954f5f757-l9jdp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.759728 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-l9jdp" podUID="303b949b-a531-46fa-a69d-6cc909009fc4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection 
refused" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.760638 4891 generic.go:334] "Generic (PLEG): container finished" podID="b457bc1b-81dc-4e12-a955-7cf02a8a03e6" containerID="98c626c156f83d326fff7b198fab71a72b37ca6e807ce817d290e42d5697f235" exitCode=0 Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.761500 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9n7fc" event={"ID":"b457bc1b-81dc-4e12-a955-7cf02a8a03e6","Type":"ContainerDied","Data":"98c626c156f83d326fff7b198fab71a72b37ca6e807ce817d290e42d5697f235"} Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.761537 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9n7fc" event={"ID":"b457bc1b-81dc-4e12-a955-7cf02a8a03e6","Type":"ContainerStarted","Data":"f017faa7182887872ba59f760fa41e7a0aa26e50828348dba4f55a6dabdadcc8"} Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.763214 4891 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.764655 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p4zqs" event={"ID":"255a721a-d660-405d-a4d4-9f1cfdc6bb76","Type":"ContainerStarted","Data":"621dc79f0afdfaed5938cc73f6afa0d5dda9f9e22c6a923222d00bb76a780bb6"} Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.775070 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnw4z" event={"ID":"cf8f9aa8-4899-46e0-bf4f-c614dcd05804","Type":"ContainerStarted","Data":"39a12d4fdf585cea2abc1696e213dc3bfcb396881393624e503ec7a2b3d4277f"} Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.784698 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.786114 4891 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.792099 4891 patch_prober.go:28] interesting pod/console-f9d7485db-4nznm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.22:8443/health\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.792168 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-4nznm" podUID="4fe5a5b3-033b-4d7e-8829-65de16f908a2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.22:8443/health\": dial tcp 10.217.0.22:8443: connect: connection refused" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.793920 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" event={"ID":"44c6fbd1-b182-4c7a-867a-6f310ed97fdb","Type":"ContainerStarted","Data":"db3464ee14cdcad1c86975da75b2399fb3c59df2e7306800e05a7f8217e30d49"} Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.794070 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" event={"ID":"44c6fbd1-b182-4c7a-867a-6f310ed97fdb","Type":"ContainerStarted","Data":"f20a96ed722abea3768916fe9191396fca2b90a61078a573c594899eacd1d5ae"} Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.810994 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.816389 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.845980 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-28nrn" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.850364 4891 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.850408 4891 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.851985 4891 patch_prober.go:28] interesting pod/router-default-5444994796-28nrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 09:50:22 crc kubenswrapper[4891]: [-]has-synced failed: reason withheld Sep 29 09:50:22 crc kubenswrapper[4891]: [+]process-running ok Sep 29 09:50:22 crc kubenswrapper[4891]: healthz check failed Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.852065 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28nrn" podUID="e320dc35-e65d-489f-b752-da6f9eda884f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 09:50:22 crc kubenswrapper[4891]: W0929 09:50:22.864367 4891 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf87cee97_3326_4334_95e7_da15db1dbf12.slice/crio-f08ed0701a7a7e0989f8eebedf673d4aaaf8d60137e393b909e01dc067a93ce6 WatchSource:0}: Error finding container f08ed0701a7a7e0989f8eebedf673d4aaaf8d60137e393b909e01dc067a93ce6: Status 404 returned error can't find the container with id f08ed0701a7a7e0989f8eebedf673d4aaaf8d60137e393b909e01dc067a93ce6 Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.881196 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.887624 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.908480 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mgnzm\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.912551 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lzjgr" Sep 29 09:50:22 crc kubenswrapper[4891]: I0929 09:50:22.917581 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-9nwpf" podStartSLOduration=12.917554824 podStartE2EDuration="12.917554824s" podCreationTimestamp="2025-09-29 09:50:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:22.82057345 +0000 UTC m=+153.025741791" watchObservedRunningTime="2025-09-29 09:50:22.917554824 +0000 UTC 
m=+153.122723145" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.133542 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4tsjt"] Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.133807 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.140343 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4tsjt" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.148364 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.151891 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4tsjt"] Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.230094 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea6d0237-b356-48d2-bc1b-e28b47089506-catalog-content\") pod \"redhat-marketplace-4tsjt\" (UID: \"ea6d0237-b356-48d2-bc1b-e28b47089506\") " pod="openshift-marketplace/redhat-marketplace-4tsjt" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.230157 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea6d0237-b356-48d2-bc1b-e28b47089506-utilities\") pod \"redhat-marketplace-4tsjt\" (UID: \"ea6d0237-b356-48d2-bc1b-e28b47089506\") " pod="openshift-marketplace/redhat-marketplace-4tsjt" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.230228 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wl6j\" (UniqueName: 
\"kubernetes.io/projected/ea6d0237-b356-48d2-bc1b-e28b47089506-kube-api-access-8wl6j\") pod \"redhat-marketplace-4tsjt\" (UID: \"ea6d0237-b356-48d2-bc1b-e28b47089506\") " pod="openshift-marketplace/redhat-marketplace-4tsjt" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.262014 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-cvhgh" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.331553 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/443c2a5c-8366-4170-80e7-063687c1caaf-config-volume\") pod \"443c2a5c-8366-4170-80e7-063687c1caaf\" (UID: \"443c2a5c-8366-4170-80e7-063687c1caaf\") " Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.331700 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/443c2a5c-8366-4170-80e7-063687c1caaf-secret-volume\") pod \"443c2a5c-8366-4170-80e7-063687c1caaf\" (UID: \"443c2a5c-8366-4170-80e7-063687c1caaf\") " Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.331762 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbdpj\" (UniqueName: \"kubernetes.io/projected/443c2a5c-8366-4170-80e7-063687c1caaf-kube-api-access-gbdpj\") pod \"443c2a5c-8366-4170-80e7-063687c1caaf\" (UID: \"443c2a5c-8366-4170-80e7-063687c1caaf\") " Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.332050 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wl6j\" (UniqueName: \"kubernetes.io/projected/ea6d0237-b356-48d2-bc1b-e28b47089506-kube-api-access-8wl6j\") pod \"redhat-marketplace-4tsjt\" (UID: \"ea6d0237-b356-48d2-bc1b-e28b47089506\") " pod="openshift-marketplace/redhat-marketplace-4tsjt" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.332131 4891 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea6d0237-b356-48d2-bc1b-e28b47089506-catalog-content\") pod \"redhat-marketplace-4tsjt\" (UID: \"ea6d0237-b356-48d2-bc1b-e28b47089506\") " pod="openshift-marketplace/redhat-marketplace-4tsjt" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.332172 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea6d0237-b356-48d2-bc1b-e28b47089506-utilities\") pod \"redhat-marketplace-4tsjt\" (UID: \"ea6d0237-b356-48d2-bc1b-e28b47089506\") " pod="openshift-marketplace/redhat-marketplace-4tsjt" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.334503 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea6d0237-b356-48d2-bc1b-e28b47089506-utilities\") pod \"redhat-marketplace-4tsjt\" (UID: \"ea6d0237-b356-48d2-bc1b-e28b47089506\") " pod="openshift-marketplace/redhat-marketplace-4tsjt" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.336170 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/443c2a5c-8366-4170-80e7-063687c1caaf-config-volume" (OuterVolumeSpecName: "config-volume") pod "443c2a5c-8366-4170-80e7-063687c1caaf" (UID: "443c2a5c-8366-4170-80e7-063687c1caaf"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.336693 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea6d0237-b356-48d2-bc1b-e28b47089506-catalog-content\") pod \"redhat-marketplace-4tsjt\" (UID: \"ea6d0237-b356-48d2-bc1b-e28b47089506\") " pod="openshift-marketplace/redhat-marketplace-4tsjt" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.343694 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/443c2a5c-8366-4170-80e7-063687c1caaf-kube-api-access-gbdpj" (OuterVolumeSpecName: "kube-api-access-gbdpj") pod "443c2a5c-8366-4170-80e7-063687c1caaf" (UID: "443c2a5c-8366-4170-80e7-063687c1caaf"). InnerVolumeSpecName "kube-api-access-gbdpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.351984 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/443c2a5c-8366-4170-80e7-063687c1caaf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "443c2a5c-8366-4170-80e7-063687c1caaf" (UID: "443c2a5c-8366-4170-80e7-063687c1caaf"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.392562 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wl6j\" (UniqueName: \"kubernetes.io/projected/ea6d0237-b356-48d2-bc1b-e28b47089506-kube-api-access-8wl6j\") pod \"redhat-marketplace-4tsjt\" (UID: \"ea6d0237-b356-48d2-bc1b-e28b47089506\") " pod="openshift-marketplace/redhat-marketplace-4tsjt" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.423742 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.435107 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbdpj\" (UniqueName: \"kubernetes.io/projected/443c2a5c-8366-4170-80e7-063687c1caaf-kube-api-access-gbdpj\") on node \"crc\" DevicePath \"\"" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.435143 4891 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/443c2a5c-8366-4170-80e7-063687c1caaf-config-volume\") on node \"crc\" DevicePath \"\"" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.435159 4891 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/443c2a5c-8366-4170-80e7-063687c1caaf-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.472937 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4tsjt" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.528848 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c9qxq"] Sep 29 09:50:23 crc kubenswrapper[4891]: E0929 09:50:23.529116 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="443c2a5c-8366-4170-80e7-063687c1caaf" containerName="collect-profiles" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.529132 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="443c2a5c-8366-4170-80e7-063687c1caaf" containerName="collect-profiles" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.529264 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="443c2a5c-8366-4170-80e7-063687c1caaf" containerName="collect-profiles" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.530181 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c9qxq" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.542398 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c9qxq"] Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.583914 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mgnzm"] Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.593179 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-jnfml" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.639214 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj6xb\" (UniqueName: \"kubernetes.io/projected/72906240-c8d2-48b3-b36b-1c761b33c1ec-kube-api-access-kj6xb\") pod \"redhat-marketplace-c9qxq\" (UID: \"72906240-c8d2-48b3-b36b-1c761b33c1ec\") " 
pod="openshift-marketplace/redhat-marketplace-c9qxq" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.639756 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72906240-c8d2-48b3-b36b-1c761b33c1ec-catalog-content\") pod \"redhat-marketplace-c9qxq\" (UID: \"72906240-c8d2-48b3-b36b-1c761b33c1ec\") " pod="openshift-marketplace/redhat-marketplace-c9qxq" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.639811 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72906240-c8d2-48b3-b36b-1c761b33c1ec-utilities\") pod \"redhat-marketplace-c9qxq\" (UID: \"72906240-c8d2-48b3-b36b-1c761b33c1ec\") " pod="openshift-marketplace/redhat-marketplace-c9qxq" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.649414 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.747227 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72906240-c8d2-48b3-b36b-1c761b33c1ec-catalog-content\") pod \"redhat-marketplace-c9qxq\" (UID: \"72906240-c8d2-48b3-b36b-1c761b33c1ec\") " pod="openshift-marketplace/redhat-marketplace-c9qxq" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.747291 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72906240-c8d2-48b3-b36b-1c761b33c1ec-utilities\") pod \"redhat-marketplace-c9qxq\" (UID: \"72906240-c8d2-48b3-b36b-1c761b33c1ec\") " pod="openshift-marketplace/redhat-marketplace-c9qxq" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.747326 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kj6xb\" (UniqueName: \"kubernetes.io/projected/72906240-c8d2-48b3-b36b-1c761b33c1ec-kube-api-access-kj6xb\") pod \"redhat-marketplace-c9qxq\" (UID: \"72906240-c8d2-48b3-b36b-1c761b33c1ec\") " pod="openshift-marketplace/redhat-marketplace-c9qxq" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.748772 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72906240-c8d2-48b3-b36b-1c761b33c1ec-catalog-content\") pod \"redhat-marketplace-c9qxq\" (UID: \"72906240-c8d2-48b3-b36b-1c761b33c1ec\") " pod="openshift-marketplace/redhat-marketplace-c9qxq" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.749617 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72906240-c8d2-48b3-b36b-1c761b33c1ec-utilities\") pod \"redhat-marketplace-c9qxq\" (UID: \"72906240-c8d2-48b3-b36b-1c761b33c1ec\") " pod="openshift-marketplace/redhat-marketplace-c9qxq" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.791058 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj6xb\" (UniqueName: \"kubernetes.io/projected/72906240-c8d2-48b3-b36b-1c761b33c1ec-kube-api-access-kj6xb\") pod \"redhat-marketplace-c9qxq\" (UID: \"72906240-c8d2-48b3-b36b-1c761b33c1ec\") " pod="openshift-marketplace/redhat-marketplace-c9qxq" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.828901 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a498a4fe-29bb-4868-80e7-2576fd472ac8","Type":"ContainerStarted","Data":"075975a5ca22b922eed569a486660db9be45edfeac3b9958eb587a72ad38a52e"} Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.836194 4891 generic.go:334] "Generic (PLEG): container finished" podID="cf8f9aa8-4899-46e0-bf4f-c614dcd05804" containerID="f558300f7aebf05a5e20e043703bfc9cd648d29cb1554342a8eee588fe07203b" 
exitCode=0 Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.836341 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnw4z" event={"ID":"cf8f9aa8-4899-46e0-bf4f-c614dcd05804","Type":"ContainerDied","Data":"f558300f7aebf05a5e20e043703bfc9cd648d29cb1554342a8eee588fe07203b"} Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.857630 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-cvhgh" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.857682 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-cvhgh" event={"ID":"443c2a5c-8366-4170-80e7-063687c1caaf","Type":"ContainerDied","Data":"53e88b0c1ab6bfe91e8a6601b18159904f16e572e1156a1e4a7738ce814db722"} Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.857745 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53e88b0c1ab6bfe91e8a6601b18159904f16e572e1156a1e4a7738ce814db722" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.863378 4891 patch_prober.go:28] interesting pod/router-default-5444994796-28nrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 09:50:23 crc kubenswrapper[4891]: [-]has-synced failed: reason withheld Sep 29 09:50:23 crc kubenswrapper[4891]: [+]process-running ok Sep 29 09:50:23 crc kubenswrapper[4891]: healthz check failed Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.863435 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28nrn" podUID="e320dc35-e65d-489f-b752-da6f9eda884f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 
09:50:23.873277 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" event={"ID":"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d","Type":"ContainerStarted","Data":"d4c7af6e94172f4a4fe8530aba7b6469829f2db8797abe7fd5eafc038b08193d"} Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.875335 4891 generic.go:334] "Generic (PLEG): container finished" podID="255a721a-d660-405d-a4d4-9f1cfdc6bb76" containerID="d4a5ae7e6a131c3a8329b50d241cdca292e97bfac3345013915d5156476da41f" exitCode=0 Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.875401 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p4zqs" event={"ID":"255a721a-d660-405d-a4d4-9f1cfdc6bb76","Type":"ContainerDied","Data":"d4a5ae7e6a131c3a8329b50d241cdca292e97bfac3345013915d5156476da41f"} Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.876960 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c9qxq" Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.882699 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f87cee97-3326-4334-95e7-da15db1dbf12","Type":"ContainerStarted","Data":"dc0655d78cc54f9a5dc6ed3318d59086122a8884f8aed6527855b9d64bc5fba2"} Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.882756 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f87cee97-3326-4334-95e7-da15db1dbf12","Type":"ContainerStarted","Data":"f08ed0701a7a7e0989f8eebedf673d4aaaf8d60137e393b909e01dc067a93ce6"} Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.899660 4891 generic.go:334] "Generic (PLEG): container finished" podID="564b206f-9094-4670-a3e0-2293b94fe724" containerID="b1aeb5c65cadadf22ef7a0029f82ca86cb10f07b7653fd087abb9ecaed03e824" exitCode=0 Sep 29 09:50:23 crc 
kubenswrapper[4891]: I0929 09:50:23.899919 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqbbz" event={"ID":"564b206f-9094-4670-a3e0-2293b94fe724","Type":"ContainerDied","Data":"b1aeb5c65cadadf22ef7a0029f82ca86cb10f07b7653fd087abb9ecaed03e824"} Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.899983 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqbbz" event={"ID":"564b206f-9094-4670-a3e0-2293b94fe724","Type":"ContainerStarted","Data":"f9f61da44f31b4f0b819428599663d46d56d6717175dffd77f348546ce168b2a"} Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.953210 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4tsjt"] Sep 29 09:50:23 crc kubenswrapper[4891]: I0929 09:50:23.953267 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.953252094 podStartE2EDuration="1.953252094s" podCreationTimestamp="2025-09-29 09:50:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:23.952213003 +0000 UTC m=+154.157381334" watchObservedRunningTime="2025-09-29 09:50:23.953252094 +0000 UTC m=+154.158420435" Sep 29 09:50:23 crc kubenswrapper[4891]: W0929 09:50:23.966718 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6d0237_b356_48d2_bc1b_e28b47089506.slice/crio-784ce309adcd765ed30ee43cd2bc1ecdb452720763e1af264269d706d71b6342 WatchSource:0}: Error finding container 784ce309adcd765ed30ee43cd2bc1ecdb452720763e1af264269d706d71b6342: Status 404 returned error can't find the container with id 784ce309adcd765ed30ee43cd2bc1ecdb452720763e1af264269d706d71b6342 Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.013309 4891 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-96vcr" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.114604 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lpncm"] Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.116546 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lpncm" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.119832 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.131088 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lpncm"] Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.183264 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c9qxq"] Sep 29 09:50:24 crc kubenswrapper[4891]: W0929 09:50:24.216288 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72906240_c8d2_48b3_b36b_1c761b33c1ec.slice/crio-9844dcf8920159f0a3c9cf6b08fe2923717dab7ecb70dcf3adbea5d2725e2a7f WatchSource:0}: Error finding container 9844dcf8920159f0a3c9cf6b08fe2923717dab7ecb70dcf3adbea5d2725e2a7f: Status 404 returned error can't find the container with id 9844dcf8920159f0a3c9cf6b08fe2923717dab7ecb70dcf3adbea5d2725e2a7f Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.260434 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea8331db-bf12-47e7-80c8-abd1d766b214-catalog-content\") pod \"redhat-operators-lpncm\" (UID: \"ea8331db-bf12-47e7-80c8-abd1d766b214\") " pod="openshift-marketplace/redhat-operators-lpncm" Sep 29 09:50:24 crc 
kubenswrapper[4891]: I0929 09:50:24.260529 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea8331db-bf12-47e7-80c8-abd1d766b214-utilities\") pod \"redhat-operators-lpncm\" (UID: \"ea8331db-bf12-47e7-80c8-abd1d766b214\") " pod="openshift-marketplace/redhat-operators-lpncm" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.260607 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp9mf\" (UniqueName: \"kubernetes.io/projected/ea8331db-bf12-47e7-80c8-abd1d766b214-kube-api-access-rp9mf\") pod \"redhat-operators-lpncm\" (UID: \"ea8331db-bf12-47e7-80c8-abd1d766b214\") " pod="openshift-marketplace/redhat-operators-lpncm" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.362015 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea8331db-bf12-47e7-80c8-abd1d766b214-catalog-content\") pod \"redhat-operators-lpncm\" (UID: \"ea8331db-bf12-47e7-80c8-abd1d766b214\") " pod="openshift-marketplace/redhat-operators-lpncm" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.362112 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea8331db-bf12-47e7-80c8-abd1d766b214-utilities\") pod \"redhat-operators-lpncm\" (UID: \"ea8331db-bf12-47e7-80c8-abd1d766b214\") " pod="openshift-marketplace/redhat-operators-lpncm" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.362206 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp9mf\" (UniqueName: \"kubernetes.io/projected/ea8331db-bf12-47e7-80c8-abd1d766b214-kube-api-access-rp9mf\") pod \"redhat-operators-lpncm\" (UID: \"ea8331db-bf12-47e7-80c8-abd1d766b214\") " pod="openshift-marketplace/redhat-operators-lpncm" Sep 29 09:50:24 crc 
kubenswrapper[4891]: I0929 09:50:24.363411 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea8331db-bf12-47e7-80c8-abd1d766b214-utilities\") pod \"redhat-operators-lpncm\" (UID: \"ea8331db-bf12-47e7-80c8-abd1d766b214\") " pod="openshift-marketplace/redhat-operators-lpncm" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.363702 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea8331db-bf12-47e7-80c8-abd1d766b214-catalog-content\") pod \"redhat-operators-lpncm\" (UID: \"ea8331db-bf12-47e7-80c8-abd1d766b214\") " pod="openshift-marketplace/redhat-operators-lpncm" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.389400 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp9mf\" (UniqueName: \"kubernetes.io/projected/ea8331db-bf12-47e7-80c8-abd1d766b214-kube-api-access-rp9mf\") pod \"redhat-operators-lpncm\" (UID: \"ea8331db-bf12-47e7-80c8-abd1d766b214\") " pod="openshift-marketplace/redhat-operators-lpncm" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.408127 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.442809 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lpncm" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.509761 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-flclg"] Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.511353 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-flclg" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.523968 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-flclg"] Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.665534 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c44276eb-5210-4810-a1ae-eb6a08958f12-utilities\") pod \"redhat-operators-flclg\" (UID: \"c44276eb-5210-4810-a1ae-eb6a08958f12\") " pod="openshift-marketplace/redhat-operators-flclg" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.666246 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6sr5\" (UniqueName: \"kubernetes.io/projected/c44276eb-5210-4810-a1ae-eb6a08958f12-kube-api-access-r6sr5\") pod \"redhat-operators-flclg\" (UID: \"c44276eb-5210-4810-a1ae-eb6a08958f12\") " pod="openshift-marketplace/redhat-operators-flclg" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.666298 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c44276eb-5210-4810-a1ae-eb6a08958f12-catalog-content\") pod \"redhat-operators-flclg\" (UID: \"c44276eb-5210-4810-a1ae-eb6a08958f12\") " pod="openshift-marketplace/redhat-operators-flclg" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.768192 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c44276eb-5210-4810-a1ae-eb6a08958f12-utilities\") pod \"redhat-operators-flclg\" (UID: \"c44276eb-5210-4810-a1ae-eb6a08958f12\") " pod="openshift-marketplace/redhat-operators-flclg" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.768291 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-r6sr5\" (UniqueName: \"kubernetes.io/projected/c44276eb-5210-4810-a1ae-eb6a08958f12-kube-api-access-r6sr5\") pod \"redhat-operators-flclg\" (UID: \"c44276eb-5210-4810-a1ae-eb6a08958f12\") " pod="openshift-marketplace/redhat-operators-flclg" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.768326 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c44276eb-5210-4810-a1ae-eb6a08958f12-catalog-content\") pod \"redhat-operators-flclg\" (UID: \"c44276eb-5210-4810-a1ae-eb6a08958f12\") " pod="openshift-marketplace/redhat-operators-flclg" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.768876 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c44276eb-5210-4810-a1ae-eb6a08958f12-catalog-content\") pod \"redhat-operators-flclg\" (UID: \"c44276eb-5210-4810-a1ae-eb6a08958f12\") " pod="openshift-marketplace/redhat-operators-flclg" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.768984 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c44276eb-5210-4810-a1ae-eb6a08958f12-utilities\") pod \"redhat-operators-flclg\" (UID: \"c44276eb-5210-4810-a1ae-eb6a08958f12\") " pod="openshift-marketplace/redhat-operators-flclg" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.783306 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lpncm"] Sep 29 09:50:24 crc kubenswrapper[4891]: W0929 09:50:24.794446 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea8331db_bf12_47e7_80c8_abd1d766b214.slice/crio-c488fa2d92713076a03bca6c42e9c37b947bd838f1a942ddc0aa283f331c9713 WatchSource:0}: Error finding container c488fa2d92713076a03bca6c42e9c37b947bd838f1a942ddc0aa283f331c9713: Status 404 
returned error can't find the container with id c488fa2d92713076a03bca6c42e9c37b947bd838f1a942ddc0aa283f331c9713 Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.804964 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6sr5\" (UniqueName: \"kubernetes.io/projected/c44276eb-5210-4810-a1ae-eb6a08958f12-kube-api-access-r6sr5\") pod \"redhat-operators-flclg\" (UID: \"c44276eb-5210-4810-a1ae-eb6a08958f12\") " pod="openshift-marketplace/redhat-operators-flclg" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.859240 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-flclg" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.860289 4891 patch_prober.go:28] interesting pod/router-default-5444994796-28nrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 09:50:24 crc kubenswrapper[4891]: [-]has-synced failed: reason withheld Sep 29 09:50:24 crc kubenswrapper[4891]: [+]process-running ok Sep 29 09:50:24 crc kubenswrapper[4891]: healthz check failed Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.860479 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28nrn" podUID="e320dc35-e65d-489f-b752-da6f9eda884f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.939865 4891 generic.go:334] "Generic (PLEG): container finished" podID="f87cee97-3326-4334-95e7-da15db1dbf12" containerID="dc0655d78cc54f9a5dc6ed3318d59086122a8884f8aed6527855b9d64bc5fba2" exitCode=0 Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.939978 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"f87cee97-3326-4334-95e7-da15db1dbf12","Type":"ContainerDied","Data":"dc0655d78cc54f9a5dc6ed3318d59086122a8884f8aed6527855b9d64bc5fba2"} Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.943608 4891 generic.go:334] "Generic (PLEG): container finished" podID="72906240-c8d2-48b3-b36b-1c761b33c1ec" containerID="0dd43d35974204b72dbd98c30b859c8e8c001a14cd8351636cc3e4a814e125dc" exitCode=0 Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.943739 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9qxq" event={"ID":"72906240-c8d2-48b3-b36b-1c761b33c1ec","Type":"ContainerDied","Data":"0dd43d35974204b72dbd98c30b859c8e8c001a14cd8351636cc3e4a814e125dc"} Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.943782 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9qxq" event={"ID":"72906240-c8d2-48b3-b36b-1c761b33c1ec","Type":"ContainerStarted","Data":"9844dcf8920159f0a3c9cf6b08fe2923717dab7ecb70dcf3adbea5d2725e2a7f"} Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.957182 4891 generic.go:334] "Generic (PLEG): container finished" podID="a498a4fe-29bb-4868-80e7-2576fd472ac8" containerID="7a5f32be0d2d9b91702d7a6aead9914dd731911dc7ce7675a6adeb6e6a39e2c1" exitCode=0 Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.957297 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a498a4fe-29bb-4868-80e7-2576fd472ac8","Type":"ContainerDied","Data":"7a5f32be0d2d9b91702d7a6aead9914dd731911dc7ce7675a6adeb6e6a39e2c1"} Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.964148 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" event={"ID":"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d","Type":"ContainerStarted","Data":"619a3d2a03b3780164677fa9e4c19ed0c57b09ebcd723f695a1f8552414ed321"} Sep 29 09:50:24 crc kubenswrapper[4891]: 
I0929 09:50:24.964232 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.967088 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpncm" event={"ID":"ea8331db-bf12-47e7-80c8-abd1d766b214","Type":"ContainerStarted","Data":"c488fa2d92713076a03bca6c42e9c37b947bd838f1a942ddc0aa283f331c9713"} Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.985847 4891 generic.go:334] "Generic (PLEG): container finished" podID="ea6d0237-b356-48d2-bc1b-e28b47089506" containerID="7f7f067dc28ba6d6ae17f6ed8ff6de46db326948fb4f48ded40675d864774343" exitCode=0 Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.985976 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4tsjt" event={"ID":"ea6d0237-b356-48d2-bc1b-e28b47089506","Type":"ContainerDied","Data":"7f7f067dc28ba6d6ae17f6ed8ff6de46db326948fb4f48ded40675d864774343"} Sep 29 09:50:24 crc kubenswrapper[4891]: I0929 09:50:24.986061 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4tsjt" event={"ID":"ea6d0237-b356-48d2-bc1b-e28b47089506","Type":"ContainerStarted","Data":"784ce309adcd765ed30ee43cd2bc1ecdb452720763e1af264269d706d71b6342"} Sep 29 09:50:25 crc kubenswrapper[4891]: I0929 09:50:25.021905 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" podStartSLOduration=133.021888072 podStartE2EDuration="2m13.021888072s" podCreationTimestamp="2025-09-29 09:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:50:25.02009583 +0000 UTC m=+155.225264161" watchObservedRunningTime="2025-09-29 09:50:25.021888072 +0000 UTC m=+155.227056393" Sep 29 09:50:25 crc 
kubenswrapper[4891]: I0929 09:50:25.155224 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-flclg"] Sep 29 09:50:25 crc kubenswrapper[4891]: W0929 09:50:25.169558 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc44276eb_5210_4810_a1ae_eb6a08958f12.slice/crio-6aea964ece92ef99654368046e9ee1b164f8bae1f61f44dd245550613d20c799 WatchSource:0}: Error finding container 6aea964ece92ef99654368046e9ee1b164f8bae1f61f44dd245550613d20c799: Status 404 returned error can't find the container with id 6aea964ece92ef99654368046e9ee1b164f8bae1f61f44dd245550613d20c799 Sep 29 09:50:25 crc kubenswrapper[4891]: I0929 09:50:25.851761 4891 patch_prober.go:28] interesting pod/router-default-5444994796-28nrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 09:50:25 crc kubenswrapper[4891]: [-]has-synced failed: reason withheld Sep 29 09:50:25 crc kubenswrapper[4891]: [+]process-running ok Sep 29 09:50:25 crc kubenswrapper[4891]: healthz check failed Sep 29 09:50:25 crc kubenswrapper[4891]: I0929 09:50:25.851860 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28nrn" podUID="e320dc35-e65d-489f-b752-da6f9eda884f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 09:50:26 crc kubenswrapper[4891]: I0929 09:50:26.016091 4891 generic.go:334] "Generic (PLEG): container finished" podID="c44276eb-5210-4810-a1ae-eb6a08958f12" containerID="bfd6160f0f19758f42a768b404fffd2e9ed70e9bb960b7c096afc0d050d7b428" exitCode=0 Sep 29 09:50:26 crc kubenswrapper[4891]: I0929 09:50:26.016158 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flclg" 
event={"ID":"c44276eb-5210-4810-a1ae-eb6a08958f12","Type":"ContainerDied","Data":"bfd6160f0f19758f42a768b404fffd2e9ed70e9bb960b7c096afc0d050d7b428"} Sep 29 09:50:26 crc kubenswrapper[4891]: I0929 09:50:26.016186 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flclg" event={"ID":"c44276eb-5210-4810-a1ae-eb6a08958f12","Type":"ContainerStarted","Data":"6aea964ece92ef99654368046e9ee1b164f8bae1f61f44dd245550613d20c799"} Sep 29 09:50:26 crc kubenswrapper[4891]: I0929 09:50:26.042675 4891 generic.go:334] "Generic (PLEG): container finished" podID="ea8331db-bf12-47e7-80c8-abd1d766b214" containerID="eb0fdf38e1a7eb9a9fff315e7166e5945d7e0fabef85cf064e3921482bf4dec5" exitCode=0 Sep 29 09:50:26 crc kubenswrapper[4891]: I0929 09:50:26.044078 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpncm" event={"ID":"ea8331db-bf12-47e7-80c8-abd1d766b214","Type":"ContainerDied","Data":"eb0fdf38e1a7eb9a9fff315e7166e5945d7e0fabef85cf064e3921482bf4dec5"} Sep 29 09:50:26 crc kubenswrapper[4891]: I0929 09:50:26.390342 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 09:50:26 crc kubenswrapper[4891]: I0929 09:50:26.509456 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f87cee97-3326-4334-95e7-da15db1dbf12-kube-api-access\") pod \"f87cee97-3326-4334-95e7-da15db1dbf12\" (UID: \"f87cee97-3326-4334-95e7-da15db1dbf12\") " Sep 29 09:50:26 crc kubenswrapper[4891]: I0929 09:50:26.509914 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f87cee97-3326-4334-95e7-da15db1dbf12-kubelet-dir\") pod \"f87cee97-3326-4334-95e7-da15db1dbf12\" (UID: \"f87cee97-3326-4334-95e7-da15db1dbf12\") " Sep 29 09:50:26 crc kubenswrapper[4891]: I0929 09:50:26.510354 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f87cee97-3326-4334-95e7-da15db1dbf12-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f87cee97-3326-4334-95e7-da15db1dbf12" (UID: "f87cee97-3326-4334-95e7-da15db1dbf12"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:50:26 crc kubenswrapper[4891]: I0929 09:50:26.521905 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 09:50:26 crc kubenswrapper[4891]: I0929 09:50:26.534241 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f87cee97-3326-4334-95e7-da15db1dbf12-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f87cee97-3326-4334-95e7-da15db1dbf12" (UID: "f87cee97-3326-4334-95e7-da15db1dbf12"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:50:26 crc kubenswrapper[4891]: I0929 09:50:26.611779 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f87cee97-3326-4334-95e7-da15db1dbf12-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 29 09:50:26 crc kubenswrapper[4891]: I0929 09:50:26.611896 4891 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f87cee97-3326-4334-95e7-da15db1dbf12-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 29 09:50:26 crc kubenswrapper[4891]: I0929 09:50:26.712512 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a498a4fe-29bb-4868-80e7-2576fd472ac8-kubelet-dir\") pod \"a498a4fe-29bb-4868-80e7-2576fd472ac8\" (UID: \"a498a4fe-29bb-4868-80e7-2576fd472ac8\") " Sep 29 09:50:26 crc kubenswrapper[4891]: I0929 09:50:26.712626 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a498a4fe-29bb-4868-80e7-2576fd472ac8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a498a4fe-29bb-4868-80e7-2576fd472ac8" (UID: "a498a4fe-29bb-4868-80e7-2576fd472ac8"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:50:26 crc kubenswrapper[4891]: I0929 09:50:26.712690 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a498a4fe-29bb-4868-80e7-2576fd472ac8-kube-api-access\") pod \"a498a4fe-29bb-4868-80e7-2576fd472ac8\" (UID: \"a498a4fe-29bb-4868-80e7-2576fd472ac8\") " Sep 29 09:50:26 crc kubenswrapper[4891]: I0929 09:50:26.712997 4891 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a498a4fe-29bb-4868-80e7-2576fd472ac8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 29 09:50:26 crc kubenswrapper[4891]: I0929 09:50:26.719606 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a498a4fe-29bb-4868-80e7-2576fd472ac8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a498a4fe-29bb-4868-80e7-2576fd472ac8" (UID: "a498a4fe-29bb-4868-80e7-2576fd472ac8"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:50:26 crc kubenswrapper[4891]: I0929 09:50:26.814628 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a498a4fe-29bb-4868-80e7-2576fd472ac8-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 29 09:50:26 crc kubenswrapper[4891]: I0929 09:50:26.850181 4891 patch_prober.go:28] interesting pod/router-default-5444994796-28nrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 09:50:26 crc kubenswrapper[4891]: [-]has-synced failed: reason withheld Sep 29 09:50:26 crc kubenswrapper[4891]: [+]process-running ok Sep 29 09:50:26 crc kubenswrapper[4891]: healthz check failed Sep 29 09:50:26 crc kubenswrapper[4891]: I0929 09:50:26.850252 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28nrn" podUID="e320dc35-e65d-489f-b752-da6f9eda884f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 09:50:27 crc kubenswrapper[4891]: I0929 09:50:27.059878 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f87cee97-3326-4334-95e7-da15db1dbf12","Type":"ContainerDied","Data":"f08ed0701a7a7e0989f8eebedf673d4aaaf8d60137e393b909e01dc067a93ce6"} Sep 29 09:50:27 crc kubenswrapper[4891]: I0929 09:50:27.059969 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f08ed0701a7a7e0989f8eebedf673d4aaaf8d60137e393b909e01dc067a93ce6" Sep 29 09:50:27 crc kubenswrapper[4891]: I0929 09:50:27.059991 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 09:50:27 crc kubenswrapper[4891]: I0929 09:50:27.062563 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a498a4fe-29bb-4868-80e7-2576fd472ac8","Type":"ContainerDied","Data":"075975a5ca22b922eed569a486660db9be45edfeac3b9958eb587a72ad38a52e"} Sep 29 09:50:27 crc kubenswrapper[4891]: I0929 09:50:27.062632 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="075975a5ca22b922eed569a486660db9be45edfeac3b9958eb587a72ad38a52e" Sep 29 09:50:27 crc kubenswrapper[4891]: I0929 09:50:27.062752 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 09:50:27 crc kubenswrapper[4891]: I0929 09:50:27.647334 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:27 crc kubenswrapper[4891]: I0929 09:50:27.668104 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-fs2sv" Sep 29 09:50:27 crc kubenswrapper[4891]: I0929 09:50:27.854386 4891 patch_prober.go:28] interesting pod/router-default-5444994796-28nrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 09:50:27 crc kubenswrapper[4891]: [-]has-synced failed: reason withheld Sep 29 09:50:27 crc kubenswrapper[4891]: [+]process-running ok Sep 29 09:50:27 crc kubenswrapper[4891]: healthz check failed Sep 29 09:50:27 crc kubenswrapper[4891]: I0929 09:50:27.854715 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28nrn" podUID="e320dc35-e65d-489f-b752-da6f9eda884f" containerName="router" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Sep 29 09:50:28 crc kubenswrapper[4891]: I0929 09:50:28.849078 4891 patch_prober.go:28] interesting pod/router-default-5444994796-28nrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 09:50:28 crc kubenswrapper[4891]: [-]has-synced failed: reason withheld Sep 29 09:50:28 crc kubenswrapper[4891]: [+]process-running ok Sep 29 09:50:28 crc kubenswrapper[4891]: healthz check failed Sep 29 09:50:28 crc kubenswrapper[4891]: I0929 09:50:28.849148 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28nrn" podUID="e320dc35-e65d-489f-b752-da6f9eda884f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 09:50:29 crc kubenswrapper[4891]: I0929 09:50:29.055374 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mzzff" Sep 29 09:50:29 crc kubenswrapper[4891]: I0929 09:50:29.859269 4891 patch_prober.go:28] interesting pod/router-default-5444994796-28nrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 09:50:29 crc kubenswrapper[4891]: [-]has-synced failed: reason withheld Sep 29 09:50:29 crc kubenswrapper[4891]: [+]process-running ok Sep 29 09:50:29 crc kubenswrapper[4891]: healthz check failed Sep 29 09:50:29 crc kubenswrapper[4891]: I0929 09:50:29.859780 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28nrn" podUID="e320dc35-e65d-489f-b752-da6f9eda884f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 09:50:30 crc kubenswrapper[4891]: I0929 09:50:30.852629 4891 patch_prober.go:28] interesting 
pod/router-default-5444994796-28nrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 09:50:30 crc kubenswrapper[4891]: [-]has-synced failed: reason withheld Sep 29 09:50:30 crc kubenswrapper[4891]: [+]process-running ok Sep 29 09:50:30 crc kubenswrapper[4891]: healthz check failed Sep 29 09:50:30 crc kubenswrapper[4891]: I0929 09:50:30.852744 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28nrn" podUID="e320dc35-e65d-489f-b752-da6f9eda884f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 09:50:31 crc kubenswrapper[4891]: I0929 09:50:31.848166 4891 patch_prober.go:28] interesting pod/router-default-5444994796-28nrn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 09:50:31 crc kubenswrapper[4891]: [-]has-synced failed: reason withheld Sep 29 09:50:31 crc kubenswrapper[4891]: [+]process-running ok Sep 29 09:50:31 crc kubenswrapper[4891]: healthz check failed Sep 29 09:50:31 crc kubenswrapper[4891]: I0929 09:50:31.848322 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28nrn" podUID="e320dc35-e65d-489f-b752-da6f9eda884f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 09:50:32 crc kubenswrapper[4891]: I0929 09:50:32.771323 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-l9jdp" Sep 29 09:50:32 crc kubenswrapper[4891]: I0929 09:50:32.783362 4891 patch_prober.go:28] interesting pod/console-f9d7485db-4nznm container/console namespace/openshift-console: Startup probe status=failure output="Get 
\"https://10.217.0.22:8443/health\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Sep 29 09:50:32 crc kubenswrapper[4891]: I0929 09:50:32.783470 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-4nznm" podUID="4fe5a5b3-033b-4d7e-8829-65de16f908a2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.22:8443/health\": dial tcp 10.217.0.22:8443: connect: connection refused" Sep 29 09:50:32 crc kubenswrapper[4891]: I0929 09:50:32.849511 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-28nrn" Sep 29 09:50:32 crc kubenswrapper[4891]: I0929 09:50:32.852598 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-28nrn" Sep 29 09:50:33 crc kubenswrapper[4891]: I0929 09:50:33.968215 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs\") pod \"network-metrics-daemon-6thmw\" (UID: \"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\") " pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:50:33 crc kubenswrapper[4891]: I0929 09:50:33.981989 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45417d1e-e3f1-4cc9-9f51-65affc9d72f6-metrics-certs\") pod \"network-metrics-daemon-6thmw\" (UID: \"45417d1e-e3f1-4cc9-9f51-65affc9d72f6\") " pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:50:34 crc kubenswrapper[4891]: I0929 09:50:34.011404 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6thmw" Sep 29 09:50:36 crc kubenswrapper[4891]: I0929 09:50:36.186052 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 09:50:36 crc kubenswrapper[4891]: I0929 09:50:36.186453 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 09:50:42 crc kubenswrapper[4891]: I0929 09:50:42.787203 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:42 crc kubenswrapper[4891]: I0929 09:50:42.794050 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:50:43 crc kubenswrapper[4891]: I0929 09:50:43.138209 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:50:54 crc kubenswrapper[4891]: I0929 09:50:54.045094 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm7n" Sep 29 09:50:59 crc kubenswrapper[4891]: I0929 09:50:59.738484 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:51:00 crc kubenswrapper[4891]: E0929 09:51:00.152840 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from 
manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Sep 29 09:51:00 crc kubenswrapper[4891]: E0929 09:51:00.153047 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9zf6d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-fqbbz_openshift-marketplace(564b206f-9094-4670-a3e0-2293b94fe724): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: 
context canceled" logger="UnhandledError" Sep 29 09:51:00 crc kubenswrapper[4891]: E0929 09:51:00.154272 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-fqbbz" podUID="564b206f-9094-4670-a3e0-2293b94fe724" Sep 29 09:51:06 crc kubenswrapper[4891]: I0929 09:51:06.186299 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 09:51:06 crc kubenswrapper[4891]: I0929 09:51:06.186371 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 09:51:07 crc kubenswrapper[4891]: E0929 09:51:07.176288 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-fqbbz" podUID="564b206f-9094-4670-a3e0-2293b94fe724" Sep 29 09:51:07 crc kubenswrapper[4891]: E0929 09:51:07.230660 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Sep 29 09:51:07 crc kubenswrapper[4891]: E0929 09:51:07.230879 4891 kuberuntime_manager.go:1274] "Unhandled 
Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kj6xb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-c9qxq_openshift-marketplace(72906240-c8d2-48b3-b36b-1c761b33c1ec): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 29 09:51:07 crc kubenswrapper[4891]: E0929 09:51:07.232072 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-c9qxq" podUID="72906240-c8d2-48b3-b36b-1c761b33c1ec" Sep 29 09:51:08 crc kubenswrapper[4891]: E0929 09:51:08.315633 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-c9qxq" podUID="72906240-c8d2-48b3-b36b-1c761b33c1ec" Sep 29 09:51:08 crc kubenswrapper[4891]: E0929 09:51:08.399375 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Sep 29 09:51:08 crc kubenswrapper[4891]: E0929 09:51:08.399550 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wl6j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-4tsjt_openshift-marketplace(ea6d0237-b356-48d2-bc1b-e28b47089506): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 29 09:51:08 crc kubenswrapper[4891]: E0929 09:51:08.403799 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-4tsjt" podUID="ea6d0237-b356-48d2-bc1b-e28b47089506" Sep 29 09:51:08 crc 
kubenswrapper[4891]: E0929 09:51:08.404170 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Sep 29 09:51:08 crc kubenswrapper[4891]: E0929 09:51:08.404335 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9gqfd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-vnw4z_openshift-marketplace(cf8f9aa8-4899-46e0-bf4f-c614dcd05804): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 29 09:51:08 crc kubenswrapper[4891]: E0929 09:51:08.405462 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-vnw4z" podUID="cf8f9aa8-4899-46e0-bf4f-c614dcd05804" Sep 29 09:51:08 crc kubenswrapper[4891]: E0929 09:51:08.425129 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Sep 29 09:51:08 crc kubenswrapper[4891]: E0929 09:51:08.425830 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gq6jj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-9n7fc_openshift-marketplace(b457bc1b-81dc-4e12-a955-7cf02a8a03e6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 29 09:51:08 crc kubenswrapper[4891]: E0929 09:51:08.427077 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-9n7fc" podUID="b457bc1b-81dc-4e12-a955-7cf02a8a03e6" Sep 29 09:51:08 crc 
kubenswrapper[4891]: E0929 09:51:08.435051 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Sep 29 09:51:08 crc kubenswrapper[4891]: E0929 09:51:08.435231 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qn26q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-p4zqs_openshift-marketplace(255a721a-d660-405d-a4d4-9f1cfdc6bb76): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 29 09:51:08 crc kubenswrapper[4891]: E0929 09:51:08.436642 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-p4zqs" podUID="255a721a-d660-405d-a4d4-9f1cfdc6bb76" Sep 29 09:51:08 crc kubenswrapper[4891]: E0929 09:51:08.490158 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Sep 29 09:51:08 crc kubenswrapper[4891]: E0929 09:51:08.490294 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rp9mf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-lpncm_openshift-marketplace(ea8331db-bf12-47e7-80c8-abd1d766b214): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 29 09:51:08 crc kubenswrapper[4891]: E0929 09:51:08.491760 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-lpncm" podUID="ea8331db-bf12-47e7-80c8-abd1d766b214" Sep 29 09:51:08 crc 
kubenswrapper[4891]: W0929 09:51:08.749107 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45417d1e_e3f1_4cc9_9f51_65affc9d72f6.slice/crio-f81c83df88e2a23df81522d560af014218b3da3c302974257426ee19a78ccad9 WatchSource:0}: Error finding container f81c83df88e2a23df81522d560af014218b3da3c302974257426ee19a78ccad9: Status 404 returned error can't find the container with id f81c83df88e2a23df81522d560af014218b3da3c302974257426ee19a78ccad9 Sep 29 09:51:08 crc kubenswrapper[4891]: I0929 09:51:08.751637 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6thmw"] Sep 29 09:51:09 crc kubenswrapper[4891]: I0929 09:51:09.389520 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6thmw" event={"ID":"45417d1e-e3f1-4cc9-9f51-65affc9d72f6","Type":"ContainerStarted","Data":"e6e2ab0259413636d4374f137029a41eee3fc82efeeed8905c7ce0aa9f75198a"} Sep 29 09:51:09 crc kubenswrapper[4891]: I0929 09:51:09.389897 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6thmw" event={"ID":"45417d1e-e3f1-4cc9-9f51-65affc9d72f6","Type":"ContainerStarted","Data":"8df23a02704b95cfc089866796d0658b671abdd6c6311fdbba413b5b760882dc"} Sep 29 09:51:09 crc kubenswrapper[4891]: I0929 09:51:09.389911 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6thmw" event={"ID":"45417d1e-e3f1-4cc9-9f51-65affc9d72f6","Type":"ContainerStarted","Data":"f81c83df88e2a23df81522d560af014218b3da3c302974257426ee19a78ccad9"} Sep 29 09:51:09 crc kubenswrapper[4891]: I0929 09:51:09.392395 4891 generic.go:334] "Generic (PLEG): container finished" podID="c44276eb-5210-4810-a1ae-eb6a08958f12" containerID="3dde1e786e7e10b3be7c6818fef93f819636f76c07f326d009b061fcb07a2159" exitCode=0 Sep 29 09:51:09 crc kubenswrapper[4891]: I0929 09:51:09.392676 4891 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flclg" event={"ID":"c44276eb-5210-4810-a1ae-eb6a08958f12","Type":"ContainerDied","Data":"3dde1e786e7e10b3be7c6818fef93f819636f76c07f326d009b061fcb07a2159"} Sep 29 09:51:09 crc kubenswrapper[4891]: E0929 09:51:09.394672 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-p4zqs" podUID="255a721a-d660-405d-a4d4-9f1cfdc6bb76" Sep 29 09:51:09 crc kubenswrapper[4891]: E0929 09:51:09.394705 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-vnw4z" podUID="cf8f9aa8-4899-46e0-bf4f-c614dcd05804" Sep 29 09:51:09 crc kubenswrapper[4891]: E0929 09:51:09.394761 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9n7fc" podUID="b457bc1b-81dc-4e12-a955-7cf02a8a03e6" Sep 29 09:51:09 crc kubenswrapper[4891]: E0929 09:51:09.395540 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4tsjt" podUID="ea6d0237-b356-48d2-bc1b-e28b47089506" Sep 29 09:51:09 crc kubenswrapper[4891]: E0929 09:51:09.396380 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with 
ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lpncm" podUID="ea8331db-bf12-47e7-80c8-abd1d766b214" Sep 29 09:51:09 crc kubenswrapper[4891]: I0929 09:51:09.406082 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6thmw" podStartSLOduration=178.406059472 podStartE2EDuration="2m58.406059472s" podCreationTimestamp="2025-09-29 09:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:51:09.405701281 +0000 UTC m=+199.610869602" watchObservedRunningTime="2025-09-29 09:51:09.406059472 +0000 UTC m=+199.611227793" Sep 29 09:51:10 crc kubenswrapper[4891]: I0929 09:51:10.402576 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flclg" event={"ID":"c44276eb-5210-4810-a1ae-eb6a08958f12","Type":"ContainerStarted","Data":"3184e6172379105deaaa9affc143f37e806a6d06d12dfe8601a44604e75fab8e"} Sep 29 09:51:10 crc kubenswrapper[4891]: I0929 09:51:10.421881 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-flclg" podStartSLOduration=2.563064187 podStartE2EDuration="46.421845814s" podCreationTimestamp="2025-09-29 09:50:24 +0000 UTC" firstStartedPulling="2025-09-29 09:50:26.020295595 +0000 UTC m=+156.225463916" lastFinishedPulling="2025-09-29 09:51:09.879077222 +0000 UTC m=+200.084245543" observedRunningTime="2025-09-29 09:51:10.419429072 +0000 UTC m=+200.624597393" watchObservedRunningTime="2025-09-29 09:51:10.421845814 +0000 UTC m=+200.627014135" Sep 29 09:51:14 crc kubenswrapper[4891]: I0929 09:51:14.860230 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-flclg" Sep 29 09:51:14 crc kubenswrapper[4891]: I0929 09:51:14.861621 4891 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-flclg" Sep 29 09:51:15 crc kubenswrapper[4891]: I0929 09:51:15.808892 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-flclg" Sep 29 09:51:15 crc kubenswrapper[4891]: I0929 09:51:15.851012 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-flclg" Sep 29 09:51:16 crc kubenswrapper[4891]: I0929 09:51:16.038331 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-flclg"] Sep 29 09:51:17 crc kubenswrapper[4891]: I0929 09:51:17.439840 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-flclg" podUID="c44276eb-5210-4810-a1ae-eb6a08958f12" containerName="registry-server" containerID="cri-o://3184e6172379105deaaa9affc143f37e806a6d06d12dfe8601a44604e75fab8e" gracePeriod=2 Sep 29 09:51:17 crc kubenswrapper[4891]: I0929 09:51:17.802173 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-flclg" Sep 29 09:51:17 crc kubenswrapper[4891]: I0929 09:51:17.994375 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6sr5\" (UniqueName: \"kubernetes.io/projected/c44276eb-5210-4810-a1ae-eb6a08958f12-kube-api-access-r6sr5\") pod \"c44276eb-5210-4810-a1ae-eb6a08958f12\" (UID: \"c44276eb-5210-4810-a1ae-eb6a08958f12\") " Sep 29 09:51:17 crc kubenswrapper[4891]: I0929 09:51:17.994492 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c44276eb-5210-4810-a1ae-eb6a08958f12-utilities\") pod \"c44276eb-5210-4810-a1ae-eb6a08958f12\" (UID: \"c44276eb-5210-4810-a1ae-eb6a08958f12\") " Sep 29 09:51:17 crc kubenswrapper[4891]: I0929 09:51:17.994533 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c44276eb-5210-4810-a1ae-eb6a08958f12-catalog-content\") pod \"c44276eb-5210-4810-a1ae-eb6a08958f12\" (UID: \"c44276eb-5210-4810-a1ae-eb6a08958f12\") " Sep 29 09:51:17 crc kubenswrapper[4891]: I0929 09:51:17.996692 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c44276eb-5210-4810-a1ae-eb6a08958f12-utilities" (OuterVolumeSpecName: "utilities") pod "c44276eb-5210-4810-a1ae-eb6a08958f12" (UID: "c44276eb-5210-4810-a1ae-eb6a08958f12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:51:18 crc kubenswrapper[4891]: I0929 09:51:18.016444 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c44276eb-5210-4810-a1ae-eb6a08958f12-kube-api-access-r6sr5" (OuterVolumeSpecName: "kube-api-access-r6sr5") pod "c44276eb-5210-4810-a1ae-eb6a08958f12" (UID: "c44276eb-5210-4810-a1ae-eb6a08958f12"). InnerVolumeSpecName "kube-api-access-r6sr5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:51:18 crc kubenswrapper[4891]: I0929 09:51:18.096625 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6sr5\" (UniqueName: \"kubernetes.io/projected/c44276eb-5210-4810-a1ae-eb6a08958f12-kube-api-access-r6sr5\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:18 crc kubenswrapper[4891]: I0929 09:51:18.096677 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c44276eb-5210-4810-a1ae-eb6a08958f12-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:18 crc kubenswrapper[4891]: I0929 09:51:18.101006 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c44276eb-5210-4810-a1ae-eb6a08958f12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c44276eb-5210-4810-a1ae-eb6a08958f12" (UID: "c44276eb-5210-4810-a1ae-eb6a08958f12"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:51:18 crc kubenswrapper[4891]: I0929 09:51:18.198270 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c44276eb-5210-4810-a1ae-eb6a08958f12-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:18 crc kubenswrapper[4891]: I0929 09:51:18.448765 4891 generic.go:334] "Generic (PLEG): container finished" podID="c44276eb-5210-4810-a1ae-eb6a08958f12" containerID="3184e6172379105deaaa9affc143f37e806a6d06d12dfe8601a44604e75fab8e" exitCode=0 Sep 29 09:51:18 crc kubenswrapper[4891]: I0929 09:51:18.448830 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flclg" event={"ID":"c44276eb-5210-4810-a1ae-eb6a08958f12","Type":"ContainerDied","Data":"3184e6172379105deaaa9affc143f37e806a6d06d12dfe8601a44604e75fab8e"} Sep 29 09:51:18 crc kubenswrapper[4891]: I0929 09:51:18.448942 4891 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-flclg" Sep 29 09:51:18 crc kubenswrapper[4891]: I0929 09:51:18.448964 4891 scope.go:117] "RemoveContainer" containerID="3184e6172379105deaaa9affc143f37e806a6d06d12dfe8601a44604e75fab8e" Sep 29 09:51:18 crc kubenswrapper[4891]: I0929 09:51:18.448948 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flclg" event={"ID":"c44276eb-5210-4810-a1ae-eb6a08958f12","Type":"ContainerDied","Data":"6aea964ece92ef99654368046e9ee1b164f8bae1f61f44dd245550613d20c799"} Sep 29 09:51:18 crc kubenswrapper[4891]: I0929 09:51:18.467744 4891 scope.go:117] "RemoveContainer" containerID="3dde1e786e7e10b3be7c6818fef93f819636f76c07f326d009b061fcb07a2159" Sep 29 09:51:18 crc kubenswrapper[4891]: I0929 09:51:18.472511 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-flclg"] Sep 29 09:51:18 crc kubenswrapper[4891]: I0929 09:51:18.475756 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-flclg"] Sep 29 09:51:18 crc kubenswrapper[4891]: I0929 09:51:18.484230 4891 scope.go:117] "RemoveContainer" containerID="bfd6160f0f19758f42a768b404fffd2e9ed70e9bb960b7c096afc0d050d7b428" Sep 29 09:51:18 crc kubenswrapper[4891]: I0929 09:51:18.502355 4891 scope.go:117] "RemoveContainer" containerID="3184e6172379105deaaa9affc143f37e806a6d06d12dfe8601a44604e75fab8e" Sep 29 09:51:18 crc kubenswrapper[4891]: E0929 09:51:18.502857 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3184e6172379105deaaa9affc143f37e806a6d06d12dfe8601a44604e75fab8e\": container with ID starting with 3184e6172379105deaaa9affc143f37e806a6d06d12dfe8601a44604e75fab8e not found: ID does not exist" containerID="3184e6172379105deaaa9affc143f37e806a6d06d12dfe8601a44604e75fab8e" Sep 29 09:51:18 crc kubenswrapper[4891]: I0929 09:51:18.502909 4891 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3184e6172379105deaaa9affc143f37e806a6d06d12dfe8601a44604e75fab8e"} err="failed to get container status \"3184e6172379105deaaa9affc143f37e806a6d06d12dfe8601a44604e75fab8e\": rpc error: code = NotFound desc = could not find container \"3184e6172379105deaaa9affc143f37e806a6d06d12dfe8601a44604e75fab8e\": container with ID starting with 3184e6172379105deaaa9affc143f37e806a6d06d12dfe8601a44604e75fab8e not found: ID does not exist" Sep 29 09:51:18 crc kubenswrapper[4891]: I0929 09:51:18.502960 4891 scope.go:117] "RemoveContainer" containerID="3dde1e786e7e10b3be7c6818fef93f819636f76c07f326d009b061fcb07a2159" Sep 29 09:51:18 crc kubenswrapper[4891]: E0929 09:51:18.503229 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dde1e786e7e10b3be7c6818fef93f819636f76c07f326d009b061fcb07a2159\": container with ID starting with 3dde1e786e7e10b3be7c6818fef93f819636f76c07f326d009b061fcb07a2159 not found: ID does not exist" containerID="3dde1e786e7e10b3be7c6818fef93f819636f76c07f326d009b061fcb07a2159" Sep 29 09:51:18 crc kubenswrapper[4891]: I0929 09:51:18.503255 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dde1e786e7e10b3be7c6818fef93f819636f76c07f326d009b061fcb07a2159"} err="failed to get container status \"3dde1e786e7e10b3be7c6818fef93f819636f76c07f326d009b061fcb07a2159\": rpc error: code = NotFound desc = could not find container \"3dde1e786e7e10b3be7c6818fef93f819636f76c07f326d009b061fcb07a2159\": container with ID starting with 3dde1e786e7e10b3be7c6818fef93f819636f76c07f326d009b061fcb07a2159 not found: ID does not exist" Sep 29 09:51:18 crc kubenswrapper[4891]: I0929 09:51:18.503274 4891 scope.go:117] "RemoveContainer" containerID="bfd6160f0f19758f42a768b404fffd2e9ed70e9bb960b7c096afc0d050d7b428" Sep 29 09:51:18 crc kubenswrapper[4891]: E0929 
09:51:18.503534 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfd6160f0f19758f42a768b404fffd2e9ed70e9bb960b7c096afc0d050d7b428\": container with ID starting with bfd6160f0f19758f42a768b404fffd2e9ed70e9bb960b7c096afc0d050d7b428 not found: ID does not exist" containerID="bfd6160f0f19758f42a768b404fffd2e9ed70e9bb960b7c096afc0d050d7b428" Sep 29 09:51:18 crc kubenswrapper[4891]: I0929 09:51:18.503563 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd6160f0f19758f42a768b404fffd2e9ed70e9bb960b7c096afc0d050d7b428"} err="failed to get container status \"bfd6160f0f19758f42a768b404fffd2e9ed70e9bb960b7c096afc0d050d7b428\": rpc error: code = NotFound desc = could not find container \"bfd6160f0f19758f42a768b404fffd2e9ed70e9bb960b7c096afc0d050d7b428\": container with ID starting with bfd6160f0f19758f42a768b404fffd2e9ed70e9bb960b7c096afc0d050d7b428 not found: ID does not exist" Sep 29 09:51:20 crc kubenswrapper[4891]: I0929 09:51:20.404262 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c44276eb-5210-4810-a1ae-eb6a08958f12" path="/var/lib/kubelet/pods/c44276eb-5210-4810-a1ae-eb6a08958f12/volumes" Sep 29 09:51:23 crc kubenswrapper[4891]: I0929 09:51:23.485962 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnw4z" event={"ID":"cf8f9aa8-4899-46e0-bf4f-c614dcd05804","Type":"ContainerStarted","Data":"840b58f75368f59e243db870167fd57d9c2dde0eb3719b186e8b7fb5fd485d07"} Sep 29 09:51:23 crc kubenswrapper[4891]: I0929 09:51:23.492481 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpncm" event={"ID":"ea8331db-bf12-47e7-80c8-abd1d766b214","Type":"ContainerStarted","Data":"fe793464d62c32584b3e470849675bb6851140c34323f75f971981304cca452c"} Sep 29 09:51:23 crc kubenswrapper[4891]: I0929 09:51:23.495081 4891 generic.go:334] "Generic 
(PLEG): container finished" podID="b457bc1b-81dc-4e12-a955-7cf02a8a03e6" containerID="99b2ad5e55de862770d37b3308fc5b0d94901cb53f2667f1c69450fa78d425e0" exitCode=0 Sep 29 09:51:23 crc kubenswrapper[4891]: I0929 09:51:23.495167 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9n7fc" event={"ID":"b457bc1b-81dc-4e12-a955-7cf02a8a03e6","Type":"ContainerDied","Data":"99b2ad5e55de862770d37b3308fc5b0d94901cb53f2667f1c69450fa78d425e0"} Sep 29 09:51:23 crc kubenswrapper[4891]: I0929 09:51:23.497407 4891 generic.go:334] "Generic (PLEG): container finished" podID="72906240-c8d2-48b3-b36b-1c761b33c1ec" containerID="426f7e74cbf3fc43190e0cdd6284a692723ccc69faf59580de78d9a7f8c79df3" exitCode=0 Sep 29 09:51:23 crc kubenswrapper[4891]: I0929 09:51:23.497473 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9qxq" event={"ID":"72906240-c8d2-48b3-b36b-1c761b33c1ec","Type":"ContainerDied","Data":"426f7e74cbf3fc43190e0cdd6284a692723ccc69faf59580de78d9a7f8c79df3"} Sep 29 09:51:23 crc kubenswrapper[4891]: I0929 09:51:23.501528 4891 generic.go:334] "Generic (PLEG): container finished" podID="564b206f-9094-4670-a3e0-2293b94fe724" containerID="59f9b58cf9b91698ea97e3595e38c04b42fa95ee4d8268509dac0903710cb553" exitCode=0 Sep 29 09:51:23 crc kubenswrapper[4891]: I0929 09:51:23.501563 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqbbz" event={"ID":"564b206f-9094-4670-a3e0-2293b94fe724","Type":"ContainerDied","Data":"59f9b58cf9b91698ea97e3595e38c04b42fa95ee4d8268509dac0903710cb553"} Sep 29 09:51:24 crc kubenswrapper[4891]: I0929 09:51:24.507846 4891 generic.go:334] "Generic (PLEG): container finished" podID="ea8331db-bf12-47e7-80c8-abd1d766b214" containerID="fe793464d62c32584b3e470849675bb6851140c34323f75f971981304cca452c" exitCode=0 Sep 29 09:51:24 crc kubenswrapper[4891]: I0929 09:51:24.507963 4891 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-lpncm" event={"ID":"ea8331db-bf12-47e7-80c8-abd1d766b214","Type":"ContainerDied","Data":"fe793464d62c32584b3e470849675bb6851140c34323f75f971981304cca452c"} Sep 29 09:51:24 crc kubenswrapper[4891]: I0929 09:51:24.511019 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9n7fc" event={"ID":"b457bc1b-81dc-4e12-a955-7cf02a8a03e6","Type":"ContainerStarted","Data":"2e8d9a36624b9868e047b489f161063ddb3b0ecb7dd3c6d0ffdcd36b5cfc5985"} Sep 29 09:51:24 crc kubenswrapper[4891]: I0929 09:51:24.515359 4891 generic.go:334] "Generic (PLEG): container finished" podID="255a721a-d660-405d-a4d4-9f1cfdc6bb76" containerID="e22bcb38f8c4733217facd4f6774b89f6f78639a39c68c9b7a4333c41542db18" exitCode=0 Sep 29 09:51:24 crc kubenswrapper[4891]: I0929 09:51:24.515434 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p4zqs" event={"ID":"255a721a-d660-405d-a4d4-9f1cfdc6bb76","Type":"ContainerDied","Data":"e22bcb38f8c4733217facd4f6774b89f6f78639a39c68c9b7a4333c41542db18"} Sep 29 09:51:24 crc kubenswrapper[4891]: I0929 09:51:24.517982 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9qxq" event={"ID":"72906240-c8d2-48b3-b36b-1c761b33c1ec","Type":"ContainerStarted","Data":"116853141f5f843a9b5b1bb96e13e04114deca1cb1d0bacfdce93355affc77ed"} Sep 29 09:51:24 crc kubenswrapper[4891]: I0929 09:51:24.524109 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqbbz" event={"ID":"564b206f-9094-4670-a3e0-2293b94fe724","Type":"ContainerStarted","Data":"fde97eeb7be110bf96f700551c9769ecbcbd60fbe9fafadfdcb5dee79eca3982"} Sep 29 09:51:24 crc kubenswrapper[4891]: I0929 09:51:24.527121 4891 generic.go:334] "Generic (PLEG): container finished" podID="cf8f9aa8-4899-46e0-bf4f-c614dcd05804" 
containerID="840b58f75368f59e243db870167fd57d9c2dde0eb3719b186e8b7fb5fd485d07" exitCode=0 Sep 29 09:51:24 crc kubenswrapper[4891]: I0929 09:51:24.527174 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnw4z" event={"ID":"cf8f9aa8-4899-46e0-bf4f-c614dcd05804","Type":"ContainerDied","Data":"840b58f75368f59e243db870167fd57d9c2dde0eb3719b186e8b7fb5fd485d07"} Sep 29 09:51:24 crc kubenswrapper[4891]: I0929 09:51:24.527211 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnw4z" event={"ID":"cf8f9aa8-4899-46e0-bf4f-c614dcd05804","Type":"ContainerStarted","Data":"8b897984eb7db0fc90509244e0de54bf9304241e1265e56c421053b09d09ea73"} Sep 29 09:51:24 crc kubenswrapper[4891]: I0929 09:51:24.565986 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fqbbz" podStartSLOduration=3.5296246399999998 podStartE2EDuration="1m3.565966093s" podCreationTimestamp="2025-09-29 09:50:21 +0000 UTC" firstStartedPulling="2025-09-29 09:50:23.909218511 +0000 UTC m=+154.114386832" lastFinishedPulling="2025-09-29 09:51:23.945559964 +0000 UTC m=+214.150728285" observedRunningTime="2025-09-29 09:51:24.559274854 +0000 UTC m=+214.764443195" watchObservedRunningTime="2025-09-29 09:51:24.565966093 +0000 UTC m=+214.771134414" Sep 29 09:51:24 crc kubenswrapper[4891]: I0929 09:51:24.587220 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9n7fc" podStartSLOduration=3.116136864 podStartE2EDuration="1m4.587199067s" podCreationTimestamp="2025-09-29 09:50:20 +0000 UTC" firstStartedPulling="2025-09-29 09:50:22.76289323 +0000 UTC m=+152.968061551" lastFinishedPulling="2025-09-29 09:51:24.233955433 +0000 UTC m=+214.439123754" observedRunningTime="2025-09-29 09:51:24.583436185 +0000 UTC m=+214.788604516" watchObservedRunningTime="2025-09-29 09:51:24.587199067 +0000 UTC 
m=+214.792367388" Sep 29 09:51:24 crc kubenswrapper[4891]: I0929 09:51:24.664457 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c9qxq" podStartSLOduration=2.541068346 podStartE2EDuration="1m1.664433333s" podCreationTimestamp="2025-09-29 09:50:23 +0000 UTC" firstStartedPulling="2025-09-29 09:50:24.945966882 +0000 UTC m=+155.151135203" lastFinishedPulling="2025-09-29 09:51:24.069331869 +0000 UTC m=+214.274500190" observedRunningTime="2025-09-29 09:51:24.661952139 +0000 UTC m=+214.867120480" watchObservedRunningTime="2025-09-29 09:51:24.664433333 +0000 UTC m=+214.869601664" Sep 29 09:51:24 crc kubenswrapper[4891]: I0929 09:51:24.704503 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vnw4z" podStartSLOduration=3.6604739950000003 podStartE2EDuration="1m3.704483218s" podCreationTimestamp="2025-09-29 09:50:21 +0000 UTC" firstStartedPulling="2025-09-29 09:50:23.870104132 +0000 UTC m=+154.075272453" lastFinishedPulling="2025-09-29 09:51:23.914113355 +0000 UTC m=+214.119281676" observedRunningTime="2025-09-29 09:51:24.701369515 +0000 UTC m=+214.906537836" watchObservedRunningTime="2025-09-29 09:51:24.704483218 +0000 UTC m=+214.909651549" Sep 29 09:51:25 crc kubenswrapper[4891]: I0929 09:51:25.535328 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpncm" event={"ID":"ea8331db-bf12-47e7-80c8-abd1d766b214","Type":"ContainerStarted","Data":"c0712f5bdb0c20f675f0d8cbe5cab48094a5f579a21e81645a7eaa35942bf1e9"} Sep 29 09:51:25 crc kubenswrapper[4891]: I0929 09:51:25.537683 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p4zqs" event={"ID":"255a721a-d660-405d-a4d4-9f1cfdc6bb76","Type":"ContainerStarted","Data":"0ca5d8219f6c43350049cef0409a9cf8ecff78f9f805226d06b5cab7163e005c"} Sep 29 09:51:25 crc kubenswrapper[4891]: I0929 09:51:25.540122 
4891 generic.go:334] "Generic (PLEG): container finished" podID="ea6d0237-b356-48d2-bc1b-e28b47089506" containerID="09f7ce7792a0263f1fffae20599bc9bd5c1a2e4cc92df364324650dea6dfb0b0" exitCode=0 Sep 29 09:51:25 crc kubenswrapper[4891]: I0929 09:51:25.540155 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4tsjt" event={"ID":"ea6d0237-b356-48d2-bc1b-e28b47089506","Type":"ContainerDied","Data":"09f7ce7792a0263f1fffae20599bc9bd5c1a2e4cc92df364324650dea6dfb0b0"} Sep 29 09:51:25 crc kubenswrapper[4891]: I0929 09:51:25.561537 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lpncm" podStartSLOduration=2.711307874 podStartE2EDuration="1m1.561512451s" podCreationTimestamp="2025-09-29 09:50:24 +0000 UTC" firstStartedPulling="2025-09-29 09:50:26.045296423 +0000 UTC m=+156.250464734" lastFinishedPulling="2025-09-29 09:51:24.89550099 +0000 UTC m=+215.100669311" observedRunningTime="2025-09-29 09:51:25.559265894 +0000 UTC m=+215.764434215" watchObservedRunningTime="2025-09-29 09:51:25.561512451 +0000 UTC m=+215.766680772" Sep 29 09:51:25 crc kubenswrapper[4891]: I0929 09:51:25.598757 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p4zqs" podStartSLOduration=3.416102825 podStartE2EDuration="1m4.598731872s" podCreationTimestamp="2025-09-29 09:50:21 +0000 UTC" firstStartedPulling="2025-09-29 09:50:23.87756897 +0000 UTC m=+154.082737301" lastFinishedPulling="2025-09-29 09:51:25.060198027 +0000 UTC m=+215.265366348" observedRunningTime="2025-09-29 09:51:25.596839726 +0000 UTC m=+215.802008057" watchObservedRunningTime="2025-09-29 09:51:25.598731872 +0000 UTC m=+215.803900193" Sep 29 09:51:26 crc kubenswrapper[4891]: I0929 09:51:26.549988 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4tsjt" 
event={"ID":"ea6d0237-b356-48d2-bc1b-e28b47089506","Type":"ContainerStarted","Data":"350876b60c290718fa7b809e88c42017513c0564f09245e2c3cdfb74d9de3fbe"} Sep 29 09:51:26 crc kubenswrapper[4891]: I0929 09:51:26.574532 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4tsjt" podStartSLOduration=2.622310359 podStartE2EDuration="1m3.574510499s" podCreationTimestamp="2025-09-29 09:50:23 +0000 UTC" firstStartedPulling="2025-09-29 09:50:24.987919983 +0000 UTC m=+155.193088304" lastFinishedPulling="2025-09-29 09:51:25.940120123 +0000 UTC m=+216.145288444" observedRunningTime="2025-09-29 09:51:26.574446467 +0000 UTC m=+216.779614798" watchObservedRunningTime="2025-09-29 09:51:26.574510499 +0000 UTC m=+216.779678820" Sep 29 09:51:31 crc kubenswrapper[4891]: I0929 09:51:31.292141 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9n7fc" Sep 29 09:51:31 crc kubenswrapper[4891]: I0929 09:51:31.293783 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9n7fc" Sep 29 09:51:31 crc kubenswrapper[4891]: I0929 09:51:31.339496 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9n7fc" Sep 29 09:51:31 crc kubenswrapper[4891]: I0929 09:51:31.491268 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p4zqs" Sep 29 09:51:31 crc kubenswrapper[4891]: I0929 09:51:31.491560 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p4zqs" Sep 29 09:51:31 crc kubenswrapper[4891]: I0929 09:51:31.537508 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p4zqs" Sep 29 09:51:31 crc kubenswrapper[4891]: I0929 09:51:31.625330 4891 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p4zqs" Sep 29 09:51:31 crc kubenswrapper[4891]: I0929 09:51:31.625582 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9n7fc" Sep 29 09:51:31 crc kubenswrapper[4891]: I0929 09:51:31.638443 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vnw4z" Sep 29 09:51:31 crc kubenswrapper[4891]: I0929 09:51:31.638537 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vnw4z" Sep 29 09:51:31 crc kubenswrapper[4891]: I0929 09:51:31.687751 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vnw4z" Sep 29 09:51:31 crc kubenswrapper[4891]: I0929 09:51:31.950155 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fqbbz" Sep 29 09:51:31 crc kubenswrapper[4891]: I0929 09:51:31.950202 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fqbbz" Sep 29 09:51:31 crc kubenswrapper[4891]: I0929 09:51:31.991678 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fqbbz" Sep 29 09:51:32 crc kubenswrapper[4891]: I0929 09:51:32.152923 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cd8dx"] Sep 29 09:51:32 crc kubenswrapper[4891]: I0929 09:51:32.633036 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fqbbz" Sep 29 09:51:32 crc kubenswrapper[4891]: I0929 09:51:32.637842 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-vnw4z" Sep 29 09:51:33 crc kubenswrapper[4891]: I0929 09:51:33.242288 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vnw4z"] Sep 29 09:51:33 crc kubenswrapper[4891]: I0929 09:51:33.473520 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4tsjt" Sep 29 09:51:33 crc kubenswrapper[4891]: I0929 09:51:33.474682 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4tsjt" Sep 29 09:51:33 crc kubenswrapper[4891]: I0929 09:51:33.518696 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4tsjt" Sep 29 09:51:33 crc kubenswrapper[4891]: I0929 09:51:33.639319 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4tsjt" Sep 29 09:51:33 crc kubenswrapper[4891]: I0929 09:51:33.839416 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fqbbz"] Sep 29 09:51:33 crc kubenswrapper[4891]: I0929 09:51:33.878136 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c9qxq" Sep 29 09:51:33 crc kubenswrapper[4891]: I0929 09:51:33.878566 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c9qxq" Sep 29 09:51:33 crc kubenswrapper[4891]: I0929 09:51:33.922445 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c9qxq" Sep 29 09:51:34 crc kubenswrapper[4891]: I0929 09:51:34.443015 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lpncm" Sep 29 09:51:34 crc kubenswrapper[4891]: I0929 09:51:34.444033 4891 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lpncm" Sep 29 09:51:34 crc kubenswrapper[4891]: I0929 09:51:34.481641 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lpncm" Sep 29 09:51:34 crc kubenswrapper[4891]: I0929 09:51:34.601569 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fqbbz" podUID="564b206f-9094-4670-a3e0-2293b94fe724" containerName="registry-server" containerID="cri-o://fde97eeb7be110bf96f700551c9769ecbcbd60fbe9fafadfdcb5dee79eca3982" gracePeriod=2 Sep 29 09:51:34 crc kubenswrapper[4891]: I0929 09:51:34.602699 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vnw4z" podUID="cf8f9aa8-4899-46e0-bf4f-c614dcd05804" containerName="registry-server" containerID="cri-o://8b897984eb7db0fc90509244e0de54bf9304241e1265e56c421053b09d09ea73" gracePeriod=2 Sep 29 09:51:34 crc kubenswrapper[4891]: I0929 09:51:34.643146 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lpncm" Sep 29 09:51:34 crc kubenswrapper[4891]: I0929 09:51:34.652830 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c9qxq" Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.185890 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.185985 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.186056 4891 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.186854 4891 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"91afd0d56169c1f360c57ceb97957bc48e79615ded802e7f78b8bcb6939d55b3"} pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.186921 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" containerID="cri-o://91afd0d56169c1f360c57ceb97957bc48e79615ded802e7f78b8bcb6939d55b3" gracePeriod=600 Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.243521 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c9qxq"] Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.552517 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fqbbz" Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.616608 4891 generic.go:334] "Generic (PLEG): container finished" podID="564b206f-9094-4670-a3e0-2293b94fe724" containerID="fde97eeb7be110bf96f700551c9769ecbcbd60fbe9fafadfdcb5dee79eca3982" exitCode=0 Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.616688 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fqbbz" Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.616723 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqbbz" event={"ID":"564b206f-9094-4670-a3e0-2293b94fe724","Type":"ContainerDied","Data":"fde97eeb7be110bf96f700551c9769ecbcbd60fbe9fafadfdcb5dee79eca3982"} Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.616765 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqbbz" event={"ID":"564b206f-9094-4670-a3e0-2293b94fe724","Type":"ContainerDied","Data":"f9f61da44f31b4f0b819428599663d46d56d6717175dffd77f348546ce168b2a"} Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.616840 4891 scope.go:117] "RemoveContainer" containerID="fde97eeb7be110bf96f700551c9769ecbcbd60fbe9fafadfdcb5dee79eca3982" Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.618833 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zf6d\" (UniqueName: \"kubernetes.io/projected/564b206f-9094-4670-a3e0-2293b94fe724-kube-api-access-9zf6d\") pod \"564b206f-9094-4670-a3e0-2293b94fe724\" (UID: \"564b206f-9094-4670-a3e0-2293b94fe724\") " Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.618906 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/564b206f-9094-4670-a3e0-2293b94fe724-catalog-content\") pod \"564b206f-9094-4670-a3e0-2293b94fe724\" (UID: \"564b206f-9094-4670-a3e0-2293b94fe724\") " Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.629080 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/564b206f-9094-4670-a3e0-2293b94fe724-kube-api-access-9zf6d" (OuterVolumeSpecName: "kube-api-access-9zf6d") pod "564b206f-9094-4670-a3e0-2293b94fe724" (UID: "564b206f-9094-4670-a3e0-2293b94fe724"). 
InnerVolumeSpecName "kube-api-access-9zf6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.637395 4891 generic.go:334] "Generic (PLEG): container finished" podID="cf8f9aa8-4899-46e0-bf4f-c614dcd05804" containerID="8b897984eb7db0fc90509244e0de54bf9304241e1265e56c421053b09d09ea73" exitCode=0 Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.637521 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnw4z" event={"ID":"cf8f9aa8-4899-46e0-bf4f-c614dcd05804","Type":"ContainerDied","Data":"8b897984eb7db0fc90509244e0de54bf9304241e1265e56c421053b09d09ea73"} Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.637563 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnw4z" event={"ID":"cf8f9aa8-4899-46e0-bf4f-c614dcd05804","Type":"ContainerDied","Data":"39a12d4fdf585cea2abc1696e213dc3bfcb396881393624e503ec7a2b3d4277f"} Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.637580 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39a12d4fdf585cea2abc1696e213dc3bfcb396881393624e503ec7a2b3d4277f" Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.640958 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vnw4z" Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.648952 4891 scope.go:117] "RemoveContainer" containerID="59f9b58cf9b91698ea97e3595e38c04b42fa95ee4d8268509dac0903710cb553" Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.649200 4891 generic.go:334] "Generic (PLEG): container finished" podID="582de198-5a15-4c4c-aaea-881c638a42ac" containerID="91afd0d56169c1f360c57ceb97957bc48e79615ded802e7f78b8bcb6939d55b3" exitCode=0 Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.649994 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerDied","Data":"91afd0d56169c1f360c57ceb97957bc48e79615ded802e7f78b8bcb6939d55b3"} Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.671176 4891 scope.go:117] "RemoveContainer" containerID="b1aeb5c65cadadf22ef7a0029f82ca86cb10f07b7653fd087abb9ecaed03e824" Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.696662 4891 scope.go:117] "RemoveContainer" containerID="fde97eeb7be110bf96f700551c9769ecbcbd60fbe9fafadfdcb5dee79eca3982" Sep 29 09:51:36 crc kubenswrapper[4891]: E0929 09:51:36.698092 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fde97eeb7be110bf96f700551c9769ecbcbd60fbe9fafadfdcb5dee79eca3982\": container with ID starting with fde97eeb7be110bf96f700551c9769ecbcbd60fbe9fafadfdcb5dee79eca3982 not found: ID does not exist" containerID="fde97eeb7be110bf96f700551c9769ecbcbd60fbe9fafadfdcb5dee79eca3982" Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.698136 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fde97eeb7be110bf96f700551c9769ecbcbd60fbe9fafadfdcb5dee79eca3982"} err="failed to get container status 
\"fde97eeb7be110bf96f700551c9769ecbcbd60fbe9fafadfdcb5dee79eca3982\": rpc error: code = NotFound desc = could not find container \"fde97eeb7be110bf96f700551c9769ecbcbd60fbe9fafadfdcb5dee79eca3982\": container with ID starting with fde97eeb7be110bf96f700551c9769ecbcbd60fbe9fafadfdcb5dee79eca3982 not found: ID does not exist" Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.698170 4891 scope.go:117] "RemoveContainer" containerID="59f9b58cf9b91698ea97e3595e38c04b42fa95ee4d8268509dac0903710cb553" Sep 29 09:51:36 crc kubenswrapper[4891]: E0929 09:51:36.702337 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59f9b58cf9b91698ea97e3595e38c04b42fa95ee4d8268509dac0903710cb553\": container with ID starting with 59f9b58cf9b91698ea97e3595e38c04b42fa95ee4d8268509dac0903710cb553 not found: ID does not exist" containerID="59f9b58cf9b91698ea97e3595e38c04b42fa95ee4d8268509dac0903710cb553" Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.702368 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59f9b58cf9b91698ea97e3595e38c04b42fa95ee4d8268509dac0903710cb553"} err="failed to get container status \"59f9b58cf9b91698ea97e3595e38c04b42fa95ee4d8268509dac0903710cb553\": rpc error: code = NotFound desc = could not find container \"59f9b58cf9b91698ea97e3595e38c04b42fa95ee4d8268509dac0903710cb553\": container with ID starting with 59f9b58cf9b91698ea97e3595e38c04b42fa95ee4d8268509dac0903710cb553 not found: ID does not exist" Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.702386 4891 scope.go:117] "RemoveContainer" containerID="b1aeb5c65cadadf22ef7a0029f82ca86cb10f07b7653fd087abb9ecaed03e824" Sep 29 09:51:36 crc kubenswrapper[4891]: E0929 09:51:36.702623 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b1aeb5c65cadadf22ef7a0029f82ca86cb10f07b7653fd087abb9ecaed03e824\": container with ID starting with b1aeb5c65cadadf22ef7a0029f82ca86cb10f07b7653fd087abb9ecaed03e824 not found: ID does not exist" containerID="b1aeb5c65cadadf22ef7a0029f82ca86cb10f07b7653fd087abb9ecaed03e824" Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.702644 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1aeb5c65cadadf22ef7a0029f82ca86cb10f07b7653fd087abb9ecaed03e824"} err="failed to get container status \"b1aeb5c65cadadf22ef7a0029f82ca86cb10f07b7653fd087abb9ecaed03e824\": rpc error: code = NotFound desc = could not find container \"b1aeb5c65cadadf22ef7a0029f82ca86cb10f07b7653fd087abb9ecaed03e824\": container with ID starting with b1aeb5c65cadadf22ef7a0029f82ca86cb10f07b7653fd087abb9ecaed03e824 not found: ID does not exist" Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.719985 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf8f9aa8-4899-46e0-bf4f-c614dcd05804-utilities\") pod \"cf8f9aa8-4899-46e0-bf4f-c614dcd05804\" (UID: \"cf8f9aa8-4899-46e0-bf4f-c614dcd05804\") " Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.720048 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf8f9aa8-4899-46e0-bf4f-c614dcd05804-catalog-content\") pod \"cf8f9aa8-4899-46e0-bf4f-c614dcd05804\" (UID: \"cf8f9aa8-4899-46e0-bf4f-c614dcd05804\") " Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.720103 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gqfd\" (UniqueName: \"kubernetes.io/projected/cf8f9aa8-4899-46e0-bf4f-c614dcd05804-kube-api-access-9gqfd\") pod \"cf8f9aa8-4899-46e0-bf4f-c614dcd05804\" (UID: \"cf8f9aa8-4899-46e0-bf4f-c614dcd05804\") " Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.720154 
4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/564b206f-9094-4670-a3e0-2293b94fe724-utilities\") pod \"564b206f-9094-4670-a3e0-2293b94fe724\" (UID: \"564b206f-9094-4670-a3e0-2293b94fe724\") " Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.720390 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zf6d\" (UniqueName: \"kubernetes.io/projected/564b206f-9094-4670-a3e0-2293b94fe724-kube-api-access-9zf6d\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.722044 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/564b206f-9094-4670-a3e0-2293b94fe724-utilities" (OuterVolumeSpecName: "utilities") pod "564b206f-9094-4670-a3e0-2293b94fe724" (UID: "564b206f-9094-4670-a3e0-2293b94fe724"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.722574 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf8f9aa8-4899-46e0-bf4f-c614dcd05804-utilities" (OuterVolumeSpecName: "utilities") pod "cf8f9aa8-4899-46e0-bf4f-c614dcd05804" (UID: "cf8f9aa8-4899-46e0-bf4f-c614dcd05804"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.724449 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf8f9aa8-4899-46e0-bf4f-c614dcd05804-kube-api-access-9gqfd" (OuterVolumeSpecName: "kube-api-access-9gqfd") pod "cf8f9aa8-4899-46e0-bf4f-c614dcd05804" (UID: "cf8f9aa8-4899-46e0-bf4f-c614dcd05804"). InnerVolumeSpecName "kube-api-access-9gqfd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.821647 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gqfd\" (UniqueName: \"kubernetes.io/projected/cf8f9aa8-4899-46e0-bf4f-c614dcd05804-kube-api-access-9gqfd\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.821686 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/564b206f-9094-4670-a3e0-2293b94fe724-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:36 crc kubenswrapper[4891]: I0929 09:51:36.821703 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf8f9aa8-4899-46e0-bf4f-c614dcd05804-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:37 crc kubenswrapper[4891]: I0929 09:51:37.455613 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/564b206f-9094-4670-a3e0-2293b94fe724-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "564b206f-9094-4670-a3e0-2293b94fe724" (UID: "564b206f-9094-4670-a3e0-2293b94fe724"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:51:37 crc kubenswrapper[4891]: I0929 09:51:37.531952 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/564b206f-9094-4670-a3e0-2293b94fe724-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:37 crc kubenswrapper[4891]: I0929 09:51:37.548994 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fqbbz"] Sep 29 09:51:37 crc kubenswrapper[4891]: I0929 09:51:37.553124 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fqbbz"] Sep 29 09:51:37 crc kubenswrapper[4891]: I0929 09:51:37.656749 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c9qxq" podUID="72906240-c8d2-48b3-b36b-1c761b33c1ec" containerName="registry-server" containerID="cri-o://116853141f5f843a9b5b1bb96e13e04114deca1cb1d0bacfdce93355affc77ed" gracePeriod=2 Sep 29 09:51:37 crc kubenswrapper[4891]: I0929 09:51:37.656879 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vnw4z" Sep 29 09:51:37 crc kubenswrapper[4891]: I0929 09:51:37.680941 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf8f9aa8-4899-46e0-bf4f-c614dcd05804-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf8f9aa8-4899-46e0-bf4f-c614dcd05804" (UID: "cf8f9aa8-4899-46e0-bf4f-c614dcd05804"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:51:37 crc kubenswrapper[4891]: I0929 09:51:37.734291 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf8f9aa8-4899-46e0-bf4f-c614dcd05804-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:37 crc kubenswrapper[4891]: I0929 09:51:37.990254 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vnw4z"] Sep 29 09:51:38 crc kubenswrapper[4891]: I0929 09:51:38.013813 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vnw4z"] Sep 29 09:51:38 crc kubenswrapper[4891]: I0929 09:51:38.402609 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="564b206f-9094-4670-a3e0-2293b94fe724" path="/var/lib/kubelet/pods/564b206f-9094-4670-a3e0-2293b94fe724/volumes" Sep 29 09:51:38 crc kubenswrapper[4891]: I0929 09:51:38.404099 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf8f9aa8-4899-46e0-bf4f-c614dcd05804" path="/var/lib/kubelet/pods/cf8f9aa8-4899-46e0-bf4f-c614dcd05804/volumes" Sep 29 09:51:38 crc kubenswrapper[4891]: I0929 09:51:38.663440 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerStarted","Data":"ad896143667aee79e9b59c715f3d34dab8dd50a3b2883d46a38afda965f786f6"} Sep 29 09:51:38 crc kubenswrapper[4891]: I0929 09:51:38.665637 4891 generic.go:334] "Generic (PLEG): container finished" podID="72906240-c8d2-48b3-b36b-1c761b33c1ec" containerID="116853141f5f843a9b5b1bb96e13e04114deca1cb1d0bacfdce93355affc77ed" exitCode=0 Sep 29 09:51:38 crc kubenswrapper[4891]: I0929 09:51:38.665678 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9qxq" 
event={"ID":"72906240-c8d2-48b3-b36b-1c761b33c1ec","Type":"ContainerDied","Data":"116853141f5f843a9b5b1bb96e13e04114deca1cb1d0bacfdce93355affc77ed"} Sep 29 09:51:39 crc kubenswrapper[4891]: I0929 09:51:39.595579 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c9qxq" Sep 29 09:51:39 crc kubenswrapper[4891]: I0929 09:51:39.674378 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9qxq" event={"ID":"72906240-c8d2-48b3-b36b-1c761b33c1ec","Type":"ContainerDied","Data":"9844dcf8920159f0a3c9cf6b08fe2923717dab7ecb70dcf3adbea5d2725e2a7f"} Sep 29 09:51:39 crc kubenswrapper[4891]: I0929 09:51:39.674847 4891 scope.go:117] "RemoveContainer" containerID="116853141f5f843a9b5b1bb96e13e04114deca1cb1d0bacfdce93355affc77ed" Sep 29 09:51:39 crc kubenswrapper[4891]: I0929 09:51:39.675110 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c9qxq" Sep 29 09:51:39 crc kubenswrapper[4891]: I0929 09:51:39.708078 4891 scope.go:117] "RemoveContainer" containerID="426f7e74cbf3fc43190e0cdd6284a692723ccc69faf59580de78d9a7f8c79df3" Sep 29 09:51:39 crc kubenswrapper[4891]: I0929 09:51:39.727495 4891 scope.go:117] "RemoveContainer" containerID="0dd43d35974204b72dbd98c30b859c8e8c001a14cd8351636cc3e4a814e125dc" Sep 29 09:51:39 crc kubenswrapper[4891]: I0929 09:51:39.765468 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72906240-c8d2-48b3-b36b-1c761b33c1ec-catalog-content\") pod \"72906240-c8d2-48b3-b36b-1c761b33c1ec\" (UID: \"72906240-c8d2-48b3-b36b-1c761b33c1ec\") " Sep 29 09:51:39 crc kubenswrapper[4891]: I0929 09:51:39.765573 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj6xb\" (UniqueName: 
\"kubernetes.io/projected/72906240-c8d2-48b3-b36b-1c761b33c1ec-kube-api-access-kj6xb\") pod \"72906240-c8d2-48b3-b36b-1c761b33c1ec\" (UID: \"72906240-c8d2-48b3-b36b-1c761b33c1ec\") " Sep 29 09:51:39 crc kubenswrapper[4891]: I0929 09:51:39.765614 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72906240-c8d2-48b3-b36b-1c761b33c1ec-utilities\") pod \"72906240-c8d2-48b3-b36b-1c761b33c1ec\" (UID: \"72906240-c8d2-48b3-b36b-1c761b33c1ec\") " Sep 29 09:51:39 crc kubenswrapper[4891]: I0929 09:51:39.767083 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72906240-c8d2-48b3-b36b-1c761b33c1ec-utilities" (OuterVolumeSpecName: "utilities") pod "72906240-c8d2-48b3-b36b-1c761b33c1ec" (UID: "72906240-c8d2-48b3-b36b-1c761b33c1ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:51:39 crc kubenswrapper[4891]: I0929 09:51:39.772262 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72906240-c8d2-48b3-b36b-1c761b33c1ec-kube-api-access-kj6xb" (OuterVolumeSpecName: "kube-api-access-kj6xb") pod "72906240-c8d2-48b3-b36b-1c761b33c1ec" (UID: "72906240-c8d2-48b3-b36b-1c761b33c1ec"). InnerVolumeSpecName "kube-api-access-kj6xb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:51:39 crc kubenswrapper[4891]: I0929 09:51:39.781627 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72906240-c8d2-48b3-b36b-1c761b33c1ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72906240-c8d2-48b3-b36b-1c761b33c1ec" (UID: "72906240-c8d2-48b3-b36b-1c761b33c1ec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:51:39 crc kubenswrapper[4891]: I0929 09:51:39.867111 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj6xb\" (UniqueName: \"kubernetes.io/projected/72906240-c8d2-48b3-b36b-1c761b33c1ec-kube-api-access-kj6xb\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:39 crc kubenswrapper[4891]: I0929 09:51:39.867160 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72906240-c8d2-48b3-b36b-1c761b33c1ec-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:39 crc kubenswrapper[4891]: I0929 09:51:39.867169 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72906240-c8d2-48b3-b36b-1c761b33c1ec-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:40 crc kubenswrapper[4891]: I0929 09:51:40.011917 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c9qxq"] Sep 29 09:51:40 crc kubenswrapper[4891]: I0929 09:51:40.014942 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c9qxq"] Sep 29 09:51:40 crc kubenswrapper[4891]: I0929 09:51:40.403475 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72906240-c8d2-48b3-b36b-1c761b33c1ec" path="/var/lib/kubelet/pods/72906240-c8d2-48b3-b36b-1c761b33c1ec/volumes" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.189194 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" podUID="017fcc94-98f6-4abd-9954-c3212676f6e7" containerName="oauth-openshift" containerID="cri-o://bc0d6ee619c963baa697b2390041e160deea3a35411e5a592cd07cb3bc397311" gracePeriod=15 Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.535871 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.570608 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6499b46898-sx6bz"] Sep 29 09:51:57 crc kubenswrapper[4891]: E0929 09:51:57.572002 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c44276eb-5210-4810-a1ae-eb6a08958f12" containerName="registry-server" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.572036 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="c44276eb-5210-4810-a1ae-eb6a08958f12" containerName="registry-server" Sep 29 09:51:57 crc kubenswrapper[4891]: E0929 09:51:57.572051 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="017fcc94-98f6-4abd-9954-c3212676f6e7" containerName="oauth-openshift" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.572057 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="017fcc94-98f6-4abd-9954-c3212676f6e7" containerName="oauth-openshift" Sep 29 09:51:57 crc kubenswrapper[4891]: E0929 09:51:57.572066 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a498a4fe-29bb-4868-80e7-2576fd472ac8" containerName="pruner" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.572072 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="a498a4fe-29bb-4868-80e7-2576fd472ac8" containerName="pruner" Sep 29 09:51:57 crc kubenswrapper[4891]: E0929 09:51:57.572079 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf8f9aa8-4899-46e0-bf4f-c614dcd05804" containerName="registry-server" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.572085 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf8f9aa8-4899-46e0-bf4f-c614dcd05804" containerName="registry-server" Sep 29 09:51:57 crc kubenswrapper[4891]: E0929 09:51:57.572094 4891 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c44276eb-5210-4810-a1ae-eb6a08958f12" containerName="extract-utilities" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.572101 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="c44276eb-5210-4810-a1ae-eb6a08958f12" containerName="extract-utilities" Sep 29 09:51:57 crc kubenswrapper[4891]: E0929 09:51:57.572110 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf8f9aa8-4899-46e0-bf4f-c614dcd05804" containerName="extract-content" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.572119 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf8f9aa8-4899-46e0-bf4f-c614dcd05804" containerName="extract-content" Sep 29 09:51:57 crc kubenswrapper[4891]: E0929 09:51:57.572141 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564b206f-9094-4670-a3e0-2293b94fe724" containerName="extract-utilities" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.572149 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="564b206f-9094-4670-a3e0-2293b94fe724" containerName="extract-utilities" Sep 29 09:51:57 crc kubenswrapper[4891]: E0929 09:51:57.572158 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72906240-c8d2-48b3-b36b-1c761b33c1ec" containerName="extract-content" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.572165 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="72906240-c8d2-48b3-b36b-1c761b33c1ec" containerName="extract-content" Sep 29 09:51:57 crc kubenswrapper[4891]: E0929 09:51:57.572177 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf8f9aa8-4899-46e0-bf4f-c614dcd05804" containerName="extract-utilities" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.572183 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf8f9aa8-4899-46e0-bf4f-c614dcd05804" containerName="extract-utilities" Sep 29 09:51:57 crc kubenswrapper[4891]: E0929 09:51:57.572194 4891 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="72906240-c8d2-48b3-b36b-1c761b33c1ec" containerName="registry-server" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.572203 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="72906240-c8d2-48b3-b36b-1c761b33c1ec" containerName="registry-server" Sep 29 09:51:57 crc kubenswrapper[4891]: E0929 09:51:57.572212 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564b206f-9094-4670-a3e0-2293b94fe724" containerName="extract-content" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.572219 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="564b206f-9094-4670-a3e0-2293b94fe724" containerName="extract-content" Sep 29 09:51:57 crc kubenswrapper[4891]: E0929 09:51:57.572227 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72906240-c8d2-48b3-b36b-1c761b33c1ec" containerName="extract-utilities" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.572234 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="72906240-c8d2-48b3-b36b-1c761b33c1ec" containerName="extract-utilities" Sep 29 09:51:57 crc kubenswrapper[4891]: E0929 09:51:57.572243 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564b206f-9094-4670-a3e0-2293b94fe724" containerName="registry-server" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.572249 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="564b206f-9094-4670-a3e0-2293b94fe724" containerName="registry-server" Sep 29 09:51:57 crc kubenswrapper[4891]: E0929 09:51:57.572263 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c44276eb-5210-4810-a1ae-eb6a08958f12" containerName="extract-content" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.572269 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="c44276eb-5210-4810-a1ae-eb6a08958f12" containerName="extract-content" Sep 29 09:51:57 crc kubenswrapper[4891]: E0929 09:51:57.572279 4891 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f87cee97-3326-4334-95e7-da15db1dbf12" containerName="pruner" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.572285 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="f87cee97-3326-4334-95e7-da15db1dbf12" containerName="pruner" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.572496 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="72906240-c8d2-48b3-b36b-1c761b33c1ec" containerName="registry-server" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.572516 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="a498a4fe-29bb-4868-80e7-2576fd472ac8" containerName="pruner" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.572526 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="017fcc94-98f6-4abd-9954-c3212676f6e7" containerName="oauth-openshift" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.572541 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf8f9aa8-4899-46e0-bf4f-c614dcd05804" containerName="registry-server" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.572551 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="f87cee97-3326-4334-95e7-da15db1dbf12" containerName="pruner" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.572562 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="c44276eb-5210-4810-a1ae-eb6a08958f12" containerName="registry-server" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.572569 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="564b206f-9094-4670-a3e0-2293b94fe724" containerName="registry-server" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.573133 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.582640 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6499b46898-sx6bz"] Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.722959 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-trusted-ca-bundle\") pod \"017fcc94-98f6-4abd-9954-c3212676f6e7\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.723319 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-user-template-login\") pod \"017fcc94-98f6-4abd-9954-c3212676f6e7\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.723421 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-service-ca\") pod \"017fcc94-98f6-4abd-9954-c3212676f6e7\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.723517 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/017fcc94-98f6-4abd-9954-c3212676f6e7-audit-policies\") pod \"017fcc94-98f6-4abd-9954-c3212676f6e7\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.723612 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" 
(UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-user-template-provider-selection\") pod \"017fcc94-98f6-4abd-9954-c3212676f6e7\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.723915 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-session\") pod \"017fcc94-98f6-4abd-9954-c3212676f6e7\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.724050 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94twf\" (UniqueName: \"kubernetes.io/projected/017fcc94-98f6-4abd-9954-c3212676f6e7-kube-api-access-94twf\") pod \"017fcc94-98f6-4abd-9954-c3212676f6e7\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.724179 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-serving-cert\") pod \"017fcc94-98f6-4abd-9954-c3212676f6e7\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.724896 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/017fcc94-98f6-4abd-9954-c3212676f6e7-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "017fcc94-98f6-4abd-9954-c3212676f6e7" (UID: "017fcc94-98f6-4abd-9954-c3212676f6e7"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.724920 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "017fcc94-98f6-4abd-9954-c3212676f6e7" (UID: "017fcc94-98f6-4abd-9954-c3212676f6e7"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.725139 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "017fcc94-98f6-4abd-9954-c3212676f6e7" (UID: "017fcc94-98f6-4abd-9954-c3212676f6e7"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.730961 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-user-idp-0-file-data\") pod \"017fcc94-98f6-4abd-9954-c3212676f6e7\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.731195 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/017fcc94-98f6-4abd-9954-c3212676f6e7-kube-api-access-94twf" (OuterVolumeSpecName: "kube-api-access-94twf") pod "017fcc94-98f6-4abd-9954-c3212676f6e7" (UID: "017fcc94-98f6-4abd-9954-c3212676f6e7"). InnerVolumeSpecName "kube-api-access-94twf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.731232 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "017fcc94-98f6-4abd-9954-c3212676f6e7" (UID: "017fcc94-98f6-4abd-9954-c3212676f6e7"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.731287 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-ocp-branding-template\") pod \"017fcc94-98f6-4abd-9954-c3212676f6e7\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.731391 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-router-certs\") pod \"017fcc94-98f6-4abd-9954-c3212676f6e7\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.731495 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-cliconfig\") pod \"017fcc94-98f6-4abd-9954-c3212676f6e7\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.731601 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-user-template-error\") pod \"017fcc94-98f6-4abd-9954-c3212676f6e7\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.731669 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/017fcc94-98f6-4abd-9954-c3212676f6e7-audit-dir\") pod \"017fcc94-98f6-4abd-9954-c3212676f6e7\" (UID: \"017fcc94-98f6-4abd-9954-c3212676f6e7\") " Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.732283 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6ttt\" (UniqueName: \"kubernetes.io/projected/52fad818-cab8-4a4b-b17f-71ad155b1801-kube-api-access-c6ttt\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.732362 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-user-template-error\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.732502 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-user-template-login\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.732559 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/52fad818-cab8-4a4b-b17f-71ad155b1801-audit-policies\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.732617 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-system-session\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.732666 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/52fad818-cab8-4a4b-b17f-71ad155b1801-audit-dir\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.732770 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.732930 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.732999 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.733051 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-system-service-ca\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.733126 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.733172 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-system-router-certs\") pod \"oauth-openshift-6499b46898-sx6bz\" 
(UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.733223 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.733282 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.733391 4891 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.733422 4891 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.733446 4891 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 
09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.733466 4891 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/017fcc94-98f6-4abd-9954-c3212676f6e7-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.733510 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94twf\" (UniqueName: \"kubernetes.io/projected/017fcc94-98f6-4abd-9954-c3212676f6e7-kube-api-access-94twf\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.734524 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "017fcc94-98f6-4abd-9954-c3212676f6e7" (UID: "017fcc94-98f6-4abd-9954-c3212676f6e7"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.734992 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "017fcc94-98f6-4abd-9954-c3212676f6e7" (UID: "017fcc94-98f6-4abd-9954-c3212676f6e7"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.735604 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "017fcc94-98f6-4abd-9954-c3212676f6e7" (UID: "017fcc94-98f6-4abd-9954-c3212676f6e7"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.735649 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/017fcc94-98f6-4abd-9954-c3212676f6e7-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "017fcc94-98f6-4abd-9954-c3212676f6e7" (UID: "017fcc94-98f6-4abd-9954-c3212676f6e7"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.736569 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "017fcc94-98f6-4abd-9954-c3212676f6e7" (UID: "017fcc94-98f6-4abd-9954-c3212676f6e7"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.740152 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "017fcc94-98f6-4abd-9954-c3212676f6e7" (UID: "017fcc94-98f6-4abd-9954-c3212676f6e7"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.742713 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "017fcc94-98f6-4abd-9954-c3212676f6e7" (UID: "017fcc94-98f6-4abd-9954-c3212676f6e7"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.743287 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "017fcc94-98f6-4abd-9954-c3212676f6e7" (UID: "017fcc94-98f6-4abd-9954-c3212676f6e7"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.746439 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "017fcc94-98f6-4abd-9954-c3212676f6e7" (UID: "017fcc94-98f6-4abd-9954-c3212676f6e7"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.790128 4891 generic.go:334] "Generic (PLEG): container finished" podID="017fcc94-98f6-4abd-9954-c3212676f6e7" containerID="bc0d6ee619c963baa697b2390041e160deea3a35411e5a592cd07cb3bc397311" exitCode=0 Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.790174 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" event={"ID":"017fcc94-98f6-4abd-9954-c3212676f6e7","Type":"ContainerDied","Data":"bc0d6ee619c963baa697b2390041e160deea3a35411e5a592cd07cb3bc397311"} Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.790203 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" event={"ID":"017fcc94-98f6-4abd-9954-c3212676f6e7","Type":"ContainerDied","Data":"560872879b41c041248bbe68368ebaf67ba7ad25b27bf24df1be5e1367d888f5"} Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 
09:51:57.790224 4891 scope.go:117] "RemoveContainer" containerID="bc0d6ee619c963baa697b2390041e160deea3a35411e5a592cd07cb3bc397311" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.790241 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-cd8dx" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.814572 4891 scope.go:117] "RemoveContainer" containerID="bc0d6ee619c963baa697b2390041e160deea3a35411e5a592cd07cb3bc397311" Sep 29 09:51:57 crc kubenswrapper[4891]: E0929 09:51:57.815517 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc0d6ee619c963baa697b2390041e160deea3a35411e5a592cd07cb3bc397311\": container with ID starting with bc0d6ee619c963baa697b2390041e160deea3a35411e5a592cd07cb3bc397311 not found: ID does not exist" containerID="bc0d6ee619c963baa697b2390041e160deea3a35411e5a592cd07cb3bc397311" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.815584 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc0d6ee619c963baa697b2390041e160deea3a35411e5a592cd07cb3bc397311"} err="failed to get container status \"bc0d6ee619c963baa697b2390041e160deea3a35411e5a592cd07cb3bc397311\": rpc error: code = NotFound desc = could not find container \"bc0d6ee619c963baa697b2390041e160deea3a35411e5a592cd07cb3bc397311\": container with ID starting with bc0d6ee619c963baa697b2390041e160deea3a35411e5a592cd07cb3bc397311 not found: ID does not exist" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.823108 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cd8dx"] Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.827452 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cd8dx"] Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 
09:51:57.834526 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6ttt\" (UniqueName: \"kubernetes.io/projected/52fad818-cab8-4a4b-b17f-71ad155b1801-kube-api-access-c6ttt\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.834565 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-user-template-error\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.834627 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-user-template-login\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.834651 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/52fad818-cab8-4a4b-b17f-71ad155b1801-audit-policies\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.834677 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-system-session\") pod \"oauth-openshift-6499b46898-sx6bz\" 
(UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.834703 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/52fad818-cab8-4a4b-b17f-71ad155b1801-audit-dir\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.834745 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.834775 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.834821 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.834843 4891 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-system-service-ca\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.834876 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.834901 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-system-router-certs\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.834926 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.834949 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.835102 4891 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.835156 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/52fad818-cab8-4a4b-b17f-71ad155b1801-audit-dir\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.835699 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/52fad818-cab8-4a4b-b17f-71ad155b1801-audit-policies\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.835945 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-system-service-ca\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.836485 4891 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:57 
crc kubenswrapper[4891]: I0929 09:51:57.836532 4891 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.836547 4891 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.836564 4891 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.836575 4891 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.836588 4891 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/017fcc94-98f6-4abd-9954-c3212676f6e7-audit-dir\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.836602 4891 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.836616 4891 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/017fcc94-98f6-4abd-9954-c3212676f6e7-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.836703 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.838592 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.838729 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-user-template-error\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.838782 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-system-session\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.839069 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.839388 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.839682 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-user-template-login\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.840035 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.840209 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-system-router-certs\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: 
\"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.840974 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/52fad818-cab8-4a4b-b17f-71ad155b1801-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.851312 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6ttt\" (UniqueName: \"kubernetes.io/projected/52fad818-cab8-4a4b-b17f-71ad155b1801-kube-api-access-c6ttt\") pod \"oauth-openshift-6499b46898-sx6bz\" (UID: \"52fad818-cab8-4a4b-b17f-71ad155b1801\") " pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:57 crc kubenswrapper[4891]: I0929 09:51:57.904066 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:58 crc kubenswrapper[4891]: I0929 09:51:58.313234 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6499b46898-sx6bz"] Sep 29 09:51:58 crc kubenswrapper[4891]: I0929 09:51:58.404710 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="017fcc94-98f6-4abd-9954-c3212676f6e7" path="/var/lib/kubelet/pods/017fcc94-98f6-4abd-9954-c3212676f6e7/volumes" Sep 29 09:51:58 crc kubenswrapper[4891]: I0929 09:51:58.796889 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" event={"ID":"52fad818-cab8-4a4b-b17f-71ad155b1801","Type":"ContainerStarted","Data":"644ac5c5036a35a1e1120c6103205ff39f18ed5afcaa4e0f888669b9a283845c"} Sep 29 09:51:58 crc kubenswrapper[4891]: I0929 09:51:58.796945 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" event={"ID":"52fad818-cab8-4a4b-b17f-71ad155b1801","Type":"ContainerStarted","Data":"d132e4dc1f44acda4c6c020864d194aa302847dd630ac966d59d42bb8a60ec55"} Sep 29 09:51:58 crc kubenswrapper[4891]: I0929 09:51:58.796966 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:51:58 crc kubenswrapper[4891]: I0929 09:51:58.816540 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" podStartSLOduration=26.816522913 podStartE2EDuration="26.816522913s" podCreationTimestamp="2025-09-29 09:51:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:51:58.814039069 +0000 UTC m=+249.019207400" watchObservedRunningTime="2025-09-29 09:51:58.816522913 +0000 UTC m=+249.021691244" Sep 29 09:51:59 crc 
kubenswrapper[4891]: I0929 09:51:59.152060 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6499b46898-sx6bz" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.312460 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9n7fc"] Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.313704 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9n7fc" podUID="b457bc1b-81dc-4e12-a955-7cf02a8a03e6" containerName="registry-server" containerID="cri-o://2e8d9a36624b9868e047b489f161063ddb3b0ecb7dd3c6d0ffdcd36b5cfc5985" gracePeriod=30 Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.319155 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p4zqs"] Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.319437 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p4zqs" podUID="255a721a-d660-405d-a4d4-9f1cfdc6bb76" containerName="registry-server" containerID="cri-o://0ca5d8219f6c43350049cef0409a9cf8ecff78f9f805226d06b5cab7163e005c" gracePeriod=30 Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.332465 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-96vcr"] Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.332710 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-96vcr" podUID="492762a9-9e7d-4095-b2f4-990f58b82d21" containerName="marketplace-operator" containerID="cri-o://47e67fe98c4960879191cbdf91b2edc5054b822466a5eb583ee2722d08d0dfeb" gracePeriod=30 Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.348643 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-4tsjt"] Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.348938 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4tsjt" podUID="ea6d0237-b356-48d2-bc1b-e28b47089506" containerName="registry-server" containerID="cri-o://350876b60c290718fa7b809e88c42017513c0564f09245e2c3cdfb74d9de3fbe" gracePeriod=30 Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.363661 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gfmkm"] Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.364547 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gfmkm" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.379876 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lpncm"] Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.380169 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lpncm" podUID="ea8331db-bf12-47e7-80c8-abd1d766b214" containerName="registry-server" containerID="cri-o://c0712f5bdb0c20f675f0d8cbe5cab48094a5f579a21e81645a7eaa35942bf1e9" gracePeriod=30 Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.384723 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gfmkm"] Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.422635 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlljx\" (UniqueName: \"kubernetes.io/projected/bcfef239-c4e7-43c6-92f3-2092cd28922b-kube-api-access-jlljx\") pod \"marketplace-operator-79b997595-gfmkm\" (UID: \"bcfef239-c4e7-43c6-92f3-2092cd28922b\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfmkm" Sep 29 
09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.422779 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bcfef239-c4e7-43c6-92f3-2092cd28922b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gfmkm\" (UID: \"bcfef239-c4e7-43c6-92f3-2092cd28922b\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfmkm" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.422965 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bcfef239-c4e7-43c6-92f3-2092cd28922b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gfmkm\" (UID: \"bcfef239-c4e7-43c6-92f3-2092cd28922b\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfmkm" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.524709 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bcfef239-c4e7-43c6-92f3-2092cd28922b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gfmkm\" (UID: \"bcfef239-c4e7-43c6-92f3-2092cd28922b\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfmkm" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.524784 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bcfef239-c4e7-43c6-92f3-2092cd28922b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gfmkm\" (UID: \"bcfef239-c4e7-43c6-92f3-2092cd28922b\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfmkm" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.525444 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlljx\" (UniqueName: 
\"kubernetes.io/projected/bcfef239-c4e7-43c6-92f3-2092cd28922b-kube-api-access-jlljx\") pod \"marketplace-operator-79b997595-gfmkm\" (UID: \"bcfef239-c4e7-43c6-92f3-2092cd28922b\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfmkm" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.526959 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bcfef239-c4e7-43c6-92f3-2092cd28922b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gfmkm\" (UID: \"bcfef239-c4e7-43c6-92f3-2092cd28922b\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfmkm" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.532586 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bcfef239-c4e7-43c6-92f3-2092cd28922b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gfmkm\" (UID: \"bcfef239-c4e7-43c6-92f3-2092cd28922b\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfmkm" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.546901 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlljx\" (UniqueName: \"kubernetes.io/projected/bcfef239-c4e7-43c6-92f3-2092cd28922b-kube-api-access-jlljx\") pod \"marketplace-operator-79b997595-gfmkm\" (UID: \"bcfef239-c4e7-43c6-92f3-2092cd28922b\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfmkm" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.800290 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gfmkm" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.804426 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9n7fc" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.861889 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-96vcr" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.889155 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p4zqs" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.896114 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4tsjt" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.912713 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lpncm" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.934016 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp9mf\" (UniqueName: \"kubernetes.io/projected/ea8331db-bf12-47e7-80c8-abd1d766b214-kube-api-access-rp9mf\") pod \"ea8331db-bf12-47e7-80c8-abd1d766b214\" (UID: \"ea8331db-bf12-47e7-80c8-abd1d766b214\") " Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.934061 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wl6j\" (UniqueName: \"kubernetes.io/projected/ea6d0237-b356-48d2-bc1b-e28b47089506-kube-api-access-8wl6j\") pod \"ea6d0237-b356-48d2-bc1b-e28b47089506\" (UID: \"ea6d0237-b356-48d2-bc1b-e28b47089506\") " Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.934083 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/492762a9-9e7d-4095-b2f4-990f58b82d21-marketplace-operator-metrics\") pod \"492762a9-9e7d-4095-b2f4-990f58b82d21\" (UID: \"492762a9-9e7d-4095-b2f4-990f58b82d21\") " 
Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.934110 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gq6jj\" (UniqueName: \"kubernetes.io/projected/b457bc1b-81dc-4e12-a955-7cf02a8a03e6-kube-api-access-gq6jj\") pod \"b457bc1b-81dc-4e12-a955-7cf02a8a03e6\" (UID: \"b457bc1b-81dc-4e12-a955-7cf02a8a03e6\") " Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.934143 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea8331db-bf12-47e7-80c8-abd1d766b214-catalog-content\") pod \"ea8331db-bf12-47e7-80c8-abd1d766b214\" (UID: \"ea8331db-bf12-47e7-80c8-abd1d766b214\") " Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.934164 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/492762a9-9e7d-4095-b2f4-990f58b82d21-marketplace-trusted-ca\") pod \"492762a9-9e7d-4095-b2f4-990f58b82d21\" (UID: \"492762a9-9e7d-4095-b2f4-990f58b82d21\") " Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.934187 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea6d0237-b356-48d2-bc1b-e28b47089506-utilities\") pod \"ea6d0237-b356-48d2-bc1b-e28b47089506\" (UID: \"ea6d0237-b356-48d2-bc1b-e28b47089506\") " Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.934219 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s8ms\" (UniqueName: \"kubernetes.io/projected/492762a9-9e7d-4095-b2f4-990f58b82d21-kube-api-access-5s8ms\") pod \"492762a9-9e7d-4095-b2f4-990f58b82d21\" (UID: \"492762a9-9e7d-4095-b2f4-990f58b82d21\") " Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.934238 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b457bc1b-81dc-4e12-a955-7cf02a8a03e6-catalog-content\") pod \"b457bc1b-81dc-4e12-a955-7cf02a8a03e6\" (UID: \"b457bc1b-81dc-4e12-a955-7cf02a8a03e6\") " Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.934254 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn26q\" (UniqueName: \"kubernetes.io/projected/255a721a-d660-405d-a4d4-9f1cfdc6bb76-kube-api-access-qn26q\") pod \"255a721a-d660-405d-a4d4-9f1cfdc6bb76\" (UID: \"255a721a-d660-405d-a4d4-9f1cfdc6bb76\") " Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.934270 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea8331db-bf12-47e7-80c8-abd1d766b214-utilities\") pod \"ea8331db-bf12-47e7-80c8-abd1d766b214\" (UID: \"ea8331db-bf12-47e7-80c8-abd1d766b214\") " Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.934287 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/255a721a-d660-405d-a4d4-9f1cfdc6bb76-catalog-content\") pod \"255a721a-d660-405d-a4d4-9f1cfdc6bb76\" (UID: \"255a721a-d660-405d-a4d4-9f1cfdc6bb76\") " Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.934307 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b457bc1b-81dc-4e12-a955-7cf02a8a03e6-utilities\") pod \"b457bc1b-81dc-4e12-a955-7cf02a8a03e6\" (UID: \"b457bc1b-81dc-4e12-a955-7cf02a8a03e6\") " Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.934328 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea6d0237-b356-48d2-bc1b-e28b47089506-catalog-content\") pod \"ea6d0237-b356-48d2-bc1b-e28b47089506\" (UID: \"ea6d0237-b356-48d2-bc1b-e28b47089506\") " Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 
09:52:25.934358 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/255a721a-d660-405d-a4d4-9f1cfdc6bb76-utilities\") pod \"255a721a-d660-405d-a4d4-9f1cfdc6bb76\" (UID: \"255a721a-d660-405d-a4d4-9f1cfdc6bb76\") " Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.936347 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea6d0237-b356-48d2-bc1b-e28b47089506-utilities" (OuterVolumeSpecName: "utilities") pod "ea6d0237-b356-48d2-bc1b-e28b47089506" (UID: "ea6d0237-b356-48d2-bc1b-e28b47089506"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.936941 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/492762a9-9e7d-4095-b2f4-990f58b82d21-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "492762a9-9e7d-4095-b2f4-990f58b82d21" (UID: "492762a9-9e7d-4095-b2f4-990f58b82d21"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.939242 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea8331db-bf12-47e7-80c8-abd1d766b214-kube-api-access-rp9mf" (OuterVolumeSpecName: "kube-api-access-rp9mf") pod "ea8331db-bf12-47e7-80c8-abd1d766b214" (UID: "ea8331db-bf12-47e7-80c8-abd1d766b214"). InnerVolumeSpecName "kube-api-access-rp9mf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.940641 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/255a721a-d660-405d-a4d4-9f1cfdc6bb76-utilities" (OuterVolumeSpecName: "utilities") pod "255a721a-d660-405d-a4d4-9f1cfdc6bb76" (UID: "255a721a-d660-405d-a4d4-9f1cfdc6bb76"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.941251 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea8331db-bf12-47e7-80c8-abd1d766b214-utilities" (OuterVolumeSpecName: "utilities") pod "ea8331db-bf12-47e7-80c8-abd1d766b214" (UID: "ea8331db-bf12-47e7-80c8-abd1d766b214"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.941567 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/492762a9-9e7d-4095-b2f4-990f58b82d21-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "492762a9-9e7d-4095-b2f4-990f58b82d21" (UID: "492762a9-9e7d-4095-b2f4-990f58b82d21"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.943699 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea6d0237-b356-48d2-bc1b-e28b47089506-kube-api-access-8wl6j" (OuterVolumeSpecName: "kube-api-access-8wl6j") pod "ea6d0237-b356-48d2-bc1b-e28b47089506" (UID: "ea6d0237-b356-48d2-bc1b-e28b47089506"). InnerVolumeSpecName "kube-api-access-8wl6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.944455 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/255a721a-d660-405d-a4d4-9f1cfdc6bb76-kube-api-access-qn26q" (OuterVolumeSpecName: "kube-api-access-qn26q") pod "255a721a-d660-405d-a4d4-9f1cfdc6bb76" (UID: "255a721a-d660-405d-a4d4-9f1cfdc6bb76"). InnerVolumeSpecName "kube-api-access-qn26q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.951357 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b457bc1b-81dc-4e12-a955-7cf02a8a03e6-utilities" (OuterVolumeSpecName: "utilities") pod "b457bc1b-81dc-4e12-a955-7cf02a8a03e6" (UID: "b457bc1b-81dc-4e12-a955-7cf02a8a03e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.974952 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/492762a9-9e7d-4095-b2f4-990f58b82d21-kube-api-access-5s8ms" (OuterVolumeSpecName: "kube-api-access-5s8ms") pod "492762a9-9e7d-4095-b2f4-990f58b82d21" (UID: "492762a9-9e7d-4095-b2f4-990f58b82d21"). InnerVolumeSpecName "kube-api-access-5s8ms". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.975861 4891 generic.go:334] "Generic (PLEG): container finished" podID="492762a9-9e7d-4095-b2f4-990f58b82d21" containerID="47e67fe98c4960879191cbdf91b2edc5054b822466a5eb583ee2722d08d0dfeb" exitCode=0 Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.976026 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-96vcr" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.976165 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-96vcr" event={"ID":"492762a9-9e7d-4095-b2f4-990f58b82d21","Type":"ContainerDied","Data":"47e67fe98c4960879191cbdf91b2edc5054b822466a5eb583ee2722d08d0dfeb"} Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.976232 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-96vcr" event={"ID":"492762a9-9e7d-4095-b2f4-990f58b82d21","Type":"ContainerDied","Data":"09f8e29e1c8bcd1d38ebd19ffcbfdb62e33b53c6edb3854fa30c2ae389511161"} Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.976252 4891 scope.go:117] "RemoveContainer" containerID="47e67fe98c4960879191cbdf91b2edc5054b822466a5eb583ee2722d08d0dfeb" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.981751 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea6d0237-b356-48d2-bc1b-e28b47089506-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea6d0237-b356-48d2-bc1b-e28b47089506" (UID: "ea6d0237-b356-48d2-bc1b-e28b47089506"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.982046 4891 generic.go:334] "Generic (PLEG): container finished" podID="ea8331db-bf12-47e7-80c8-abd1d766b214" containerID="c0712f5bdb0c20f675f0d8cbe5cab48094a5f579a21e81645a7eaa35942bf1e9" exitCode=0 Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.982099 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpncm" event={"ID":"ea8331db-bf12-47e7-80c8-abd1d766b214","Type":"ContainerDied","Data":"c0712f5bdb0c20f675f0d8cbe5cab48094a5f579a21e81645a7eaa35942bf1e9"} Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.982125 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpncm" event={"ID":"ea8331db-bf12-47e7-80c8-abd1d766b214","Type":"ContainerDied","Data":"c488fa2d92713076a03bca6c42e9c37b947bd838f1a942ddc0aa283f331c9713"} Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.982184 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lpncm" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.994039 4891 generic.go:334] "Generic (PLEG): container finished" podID="b457bc1b-81dc-4e12-a955-7cf02a8a03e6" containerID="2e8d9a36624b9868e047b489f161063ddb3b0ecb7dd3c6d0ffdcd36b5cfc5985" exitCode=0 Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.994125 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9n7fc" event={"ID":"b457bc1b-81dc-4e12-a955-7cf02a8a03e6","Type":"ContainerDied","Data":"2e8d9a36624b9868e047b489f161063ddb3b0ecb7dd3c6d0ffdcd36b5cfc5985"} Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.994136 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9n7fc" Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.994167 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9n7fc" event={"ID":"b457bc1b-81dc-4e12-a955-7cf02a8a03e6","Type":"ContainerDied","Data":"f017faa7182887872ba59f760fa41e7a0aa26e50828348dba4f55a6dabdadcc8"} Sep 29 09:52:25 crc kubenswrapper[4891]: I0929 09:52:25.997904 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b457bc1b-81dc-4e12-a955-7cf02a8a03e6-kube-api-access-gq6jj" (OuterVolumeSpecName: "kube-api-access-gq6jj") pod "b457bc1b-81dc-4e12-a955-7cf02a8a03e6" (UID: "b457bc1b-81dc-4e12-a955-7cf02a8a03e6"). InnerVolumeSpecName "kube-api-access-gq6jj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.005996 4891 generic.go:334] "Generic (PLEG): container finished" podID="255a721a-d660-405d-a4d4-9f1cfdc6bb76" containerID="0ca5d8219f6c43350049cef0409a9cf8ecff78f9f805226d06b5cab7163e005c" exitCode=0 Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.006169 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p4zqs" event={"ID":"255a721a-d660-405d-a4d4-9f1cfdc6bb76","Type":"ContainerDied","Data":"0ca5d8219f6c43350049cef0409a9cf8ecff78f9f805226d06b5cab7163e005c"} Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.006229 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p4zqs" event={"ID":"255a721a-d660-405d-a4d4-9f1cfdc6bb76","Type":"ContainerDied","Data":"621dc79f0afdfaed5938cc73f6afa0d5dda9f9e22c6a923222d00bb76a780bb6"} Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.006233 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p4zqs" Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.013245 4891 scope.go:117] "RemoveContainer" containerID="47e67fe98c4960879191cbdf91b2edc5054b822466a5eb583ee2722d08d0dfeb" Sep 29 09:52:26 crc kubenswrapper[4891]: E0929 09:52:26.014406 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47e67fe98c4960879191cbdf91b2edc5054b822466a5eb583ee2722d08d0dfeb\": container with ID starting with 47e67fe98c4960879191cbdf91b2edc5054b822466a5eb583ee2722d08d0dfeb not found: ID does not exist" containerID="47e67fe98c4960879191cbdf91b2edc5054b822466a5eb583ee2722d08d0dfeb" Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.014441 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47e67fe98c4960879191cbdf91b2edc5054b822466a5eb583ee2722d08d0dfeb"} err="failed to get container status \"47e67fe98c4960879191cbdf91b2edc5054b822466a5eb583ee2722d08d0dfeb\": rpc error: code = NotFound desc = could not find container \"47e67fe98c4960879191cbdf91b2edc5054b822466a5eb583ee2722d08d0dfeb\": container with ID starting with 47e67fe98c4960879191cbdf91b2edc5054b822466a5eb583ee2722d08d0dfeb not found: ID does not exist" Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.014485 4891 scope.go:117] "RemoveContainer" containerID="c0712f5bdb0c20f675f0d8cbe5cab48094a5f579a21e81645a7eaa35942bf1e9" Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.015274 4891 generic.go:334] "Generic (PLEG): container finished" podID="ea6d0237-b356-48d2-bc1b-e28b47089506" containerID="350876b60c290718fa7b809e88c42017513c0564f09245e2c3cdfb74d9de3fbe" exitCode=0 Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.015365 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4tsjt" Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.017589 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-96vcr"] Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.017645 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-96vcr"] Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.017662 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4tsjt" event={"ID":"ea6d0237-b356-48d2-bc1b-e28b47089506","Type":"ContainerDied","Data":"350876b60c290718fa7b809e88c42017513c0564f09245e2c3cdfb74d9de3fbe"} Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.017691 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4tsjt" event={"ID":"ea6d0237-b356-48d2-bc1b-e28b47089506","Type":"ContainerDied","Data":"784ce309adcd765ed30ee43cd2bc1ecdb452720763e1af264269d706d71b6342"} Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.021328 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/255a721a-d660-405d-a4d4-9f1cfdc6bb76-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "255a721a-d660-405d-a4d4-9f1cfdc6bb76" (UID: "255a721a-d660-405d-a4d4-9f1cfdc6bb76"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.033247 4891 scope.go:117] "RemoveContainer" containerID="fe793464d62c32584b3e470849675bb6851140c34323f75f971981304cca452c"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.036019 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp9mf\" (UniqueName: \"kubernetes.io/projected/ea8331db-bf12-47e7-80c8-abd1d766b214-kube-api-access-rp9mf\") on node \"crc\" DevicePath \"\""
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.036044 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wl6j\" (UniqueName: \"kubernetes.io/projected/ea6d0237-b356-48d2-bc1b-e28b47089506-kube-api-access-8wl6j\") on node \"crc\" DevicePath \"\""
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.036055 4891 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/492762a9-9e7d-4095-b2f4-990f58b82d21-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.036066 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq6jj\" (UniqueName: \"kubernetes.io/projected/b457bc1b-81dc-4e12-a955-7cf02a8a03e6-kube-api-access-gq6jj\") on node \"crc\" DevicePath \"\""
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.036075 4891 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/492762a9-9e7d-4095-b2f4-990f58b82d21-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.036084 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea6d0237-b356-48d2-bc1b-e28b47089506-utilities\") on node \"crc\" DevicePath \"\""
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.036093 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s8ms\" (UniqueName: \"kubernetes.io/projected/492762a9-9e7d-4095-b2f4-990f58b82d21-kube-api-access-5s8ms\") on node \"crc\" DevicePath \"\""
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.036102 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea8331db-bf12-47e7-80c8-abd1d766b214-utilities\") on node \"crc\" DevicePath \"\""
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.036112 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qn26q\" (UniqueName: \"kubernetes.io/projected/255a721a-d660-405d-a4d4-9f1cfdc6bb76-kube-api-access-qn26q\") on node \"crc\" DevicePath \"\""
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.036123 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/255a721a-d660-405d-a4d4-9f1cfdc6bb76-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.036133 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b457bc1b-81dc-4e12-a955-7cf02a8a03e6-utilities\") on node \"crc\" DevicePath \"\""
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.036142 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea6d0237-b356-48d2-bc1b-e28b47089506-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.036150 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/255a721a-d660-405d-a4d4-9f1cfdc6bb76-utilities\") on node \"crc\" DevicePath \"\""
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.050110 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4tsjt"]
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.053521 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4tsjt"]
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.064902 4891 scope.go:117] "RemoveContainer" containerID="eb0fdf38e1a7eb9a9fff315e7166e5945d7e0fabef85cf064e3921482bf4dec5"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.070374 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b457bc1b-81dc-4e12-a955-7cf02a8a03e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b457bc1b-81dc-4e12-a955-7cf02a8a03e6" (UID: "b457bc1b-81dc-4e12-a955-7cf02a8a03e6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.077808 4891 scope.go:117] "RemoveContainer" containerID="c0712f5bdb0c20f675f0d8cbe5cab48094a5f579a21e81645a7eaa35942bf1e9"
Sep 29 09:52:26 crc kubenswrapper[4891]: E0929 09:52:26.080264 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0712f5bdb0c20f675f0d8cbe5cab48094a5f579a21e81645a7eaa35942bf1e9\": container with ID starting with c0712f5bdb0c20f675f0d8cbe5cab48094a5f579a21e81645a7eaa35942bf1e9 not found: ID does not exist" containerID="c0712f5bdb0c20f675f0d8cbe5cab48094a5f579a21e81645a7eaa35942bf1e9"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.080312 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0712f5bdb0c20f675f0d8cbe5cab48094a5f579a21e81645a7eaa35942bf1e9"} err="failed to get container status \"c0712f5bdb0c20f675f0d8cbe5cab48094a5f579a21e81645a7eaa35942bf1e9\": rpc error: code = NotFound desc = could not find container \"c0712f5bdb0c20f675f0d8cbe5cab48094a5f579a21e81645a7eaa35942bf1e9\": container with ID starting with c0712f5bdb0c20f675f0d8cbe5cab48094a5f579a21e81645a7eaa35942bf1e9 not found: ID does not exist"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.080351 4891 scope.go:117] "RemoveContainer" containerID="fe793464d62c32584b3e470849675bb6851140c34323f75f971981304cca452c"
Sep 29 09:52:26 crc kubenswrapper[4891]: E0929 09:52:26.080703 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe793464d62c32584b3e470849675bb6851140c34323f75f971981304cca452c\": container with ID starting with fe793464d62c32584b3e470849675bb6851140c34323f75f971981304cca452c not found: ID does not exist" containerID="fe793464d62c32584b3e470849675bb6851140c34323f75f971981304cca452c"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.080929 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe793464d62c32584b3e470849675bb6851140c34323f75f971981304cca452c"} err="failed to get container status \"fe793464d62c32584b3e470849675bb6851140c34323f75f971981304cca452c\": rpc error: code = NotFound desc = could not find container \"fe793464d62c32584b3e470849675bb6851140c34323f75f971981304cca452c\": container with ID starting with fe793464d62c32584b3e470849675bb6851140c34323f75f971981304cca452c not found: ID does not exist"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.081031 4891 scope.go:117] "RemoveContainer" containerID="eb0fdf38e1a7eb9a9fff315e7166e5945d7e0fabef85cf064e3921482bf4dec5"
Sep 29 09:52:26 crc kubenswrapper[4891]: E0929 09:52:26.081416 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb0fdf38e1a7eb9a9fff315e7166e5945d7e0fabef85cf064e3921482bf4dec5\": container with ID starting with eb0fdf38e1a7eb9a9fff315e7166e5945d7e0fabef85cf064e3921482bf4dec5 not found: ID does not exist" containerID="eb0fdf38e1a7eb9a9fff315e7166e5945d7e0fabef85cf064e3921482bf4dec5"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.081462 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb0fdf38e1a7eb9a9fff315e7166e5945d7e0fabef85cf064e3921482bf4dec5"} err="failed to get container status \"eb0fdf38e1a7eb9a9fff315e7166e5945d7e0fabef85cf064e3921482bf4dec5\": rpc error: code = NotFound desc = could not find container \"eb0fdf38e1a7eb9a9fff315e7166e5945d7e0fabef85cf064e3921482bf4dec5\": container with ID starting with eb0fdf38e1a7eb9a9fff315e7166e5945d7e0fabef85cf064e3921482bf4dec5 not found: ID does not exist"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.081503 4891 scope.go:117] "RemoveContainer" containerID="2e8d9a36624b9868e047b489f161063ddb3b0ecb7dd3c6d0ffdcd36b5cfc5985"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.096268 4891 scope.go:117] "RemoveContainer" containerID="99b2ad5e55de862770d37b3308fc5b0d94901cb53f2667f1c69450fa78d425e0"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.112705 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea8331db-bf12-47e7-80c8-abd1d766b214-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea8331db-bf12-47e7-80c8-abd1d766b214" (UID: "ea8331db-bf12-47e7-80c8-abd1d766b214"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.115858 4891 scope.go:117] "RemoveContainer" containerID="98c626c156f83d326fff7b198fab71a72b37ca6e807ce817d290e42d5697f235"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.132970 4891 scope.go:117] "RemoveContainer" containerID="2e8d9a36624b9868e047b489f161063ddb3b0ecb7dd3c6d0ffdcd36b5cfc5985"
Sep 29 09:52:26 crc kubenswrapper[4891]: E0929 09:52:26.133317 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e8d9a36624b9868e047b489f161063ddb3b0ecb7dd3c6d0ffdcd36b5cfc5985\": container with ID starting with 2e8d9a36624b9868e047b489f161063ddb3b0ecb7dd3c6d0ffdcd36b5cfc5985 not found: ID does not exist" containerID="2e8d9a36624b9868e047b489f161063ddb3b0ecb7dd3c6d0ffdcd36b5cfc5985"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.133349 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e8d9a36624b9868e047b489f161063ddb3b0ecb7dd3c6d0ffdcd36b5cfc5985"} err="failed to get container status \"2e8d9a36624b9868e047b489f161063ddb3b0ecb7dd3c6d0ffdcd36b5cfc5985\": rpc error: code = NotFound desc = could not find container \"2e8d9a36624b9868e047b489f161063ddb3b0ecb7dd3c6d0ffdcd36b5cfc5985\": container with ID starting with 2e8d9a36624b9868e047b489f161063ddb3b0ecb7dd3c6d0ffdcd36b5cfc5985 not found: ID does not exist"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.133371 4891 scope.go:117] "RemoveContainer" containerID="99b2ad5e55de862770d37b3308fc5b0d94901cb53f2667f1c69450fa78d425e0"
Sep 29 09:52:26 crc kubenswrapper[4891]: E0929 09:52:26.133620 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99b2ad5e55de862770d37b3308fc5b0d94901cb53f2667f1c69450fa78d425e0\": container with ID starting with 99b2ad5e55de862770d37b3308fc5b0d94901cb53f2667f1c69450fa78d425e0 not found: ID does not exist" containerID="99b2ad5e55de862770d37b3308fc5b0d94901cb53f2667f1c69450fa78d425e0"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.133651 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99b2ad5e55de862770d37b3308fc5b0d94901cb53f2667f1c69450fa78d425e0"} err="failed to get container status \"99b2ad5e55de862770d37b3308fc5b0d94901cb53f2667f1c69450fa78d425e0\": rpc error: code = NotFound desc = could not find container \"99b2ad5e55de862770d37b3308fc5b0d94901cb53f2667f1c69450fa78d425e0\": container with ID starting with 99b2ad5e55de862770d37b3308fc5b0d94901cb53f2667f1c69450fa78d425e0 not found: ID does not exist"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.133670 4891 scope.go:117] "RemoveContainer" containerID="98c626c156f83d326fff7b198fab71a72b37ca6e807ce817d290e42d5697f235"
Sep 29 09:52:26 crc kubenswrapper[4891]: E0929 09:52:26.134124 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98c626c156f83d326fff7b198fab71a72b37ca6e807ce817d290e42d5697f235\": container with ID starting with 98c626c156f83d326fff7b198fab71a72b37ca6e807ce817d290e42d5697f235 not found: ID does not exist" containerID="98c626c156f83d326fff7b198fab71a72b37ca6e807ce817d290e42d5697f235"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.134148 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98c626c156f83d326fff7b198fab71a72b37ca6e807ce817d290e42d5697f235"} err="failed to get container status \"98c626c156f83d326fff7b198fab71a72b37ca6e807ce817d290e42d5697f235\": rpc error: code = NotFound desc = could not find container \"98c626c156f83d326fff7b198fab71a72b37ca6e807ce817d290e42d5697f235\": container with ID starting with 98c626c156f83d326fff7b198fab71a72b37ca6e807ce817d290e42d5697f235 not found: ID does not exist"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.134164 4891 scope.go:117] "RemoveContainer" containerID="0ca5d8219f6c43350049cef0409a9cf8ecff78f9f805226d06b5cab7163e005c"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.137545 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea8331db-bf12-47e7-80c8-abd1d766b214-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.137569 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b457bc1b-81dc-4e12-a955-7cf02a8a03e6-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.145018 4891 scope.go:117] "RemoveContainer" containerID="e22bcb38f8c4733217facd4f6774b89f6f78639a39c68c9b7a4333c41542db18"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.159870 4891 scope.go:117] "RemoveContainer" containerID="d4a5ae7e6a131c3a8329b50d241cdca292e97bfac3345013915d5156476da41f"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.173861 4891 scope.go:117] "RemoveContainer" containerID="0ca5d8219f6c43350049cef0409a9cf8ecff78f9f805226d06b5cab7163e005c"
Sep 29 09:52:26 crc kubenswrapper[4891]: E0929 09:52:26.174311 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ca5d8219f6c43350049cef0409a9cf8ecff78f9f805226d06b5cab7163e005c\": container with ID starting with 0ca5d8219f6c43350049cef0409a9cf8ecff78f9f805226d06b5cab7163e005c not found: ID does not exist" containerID="0ca5d8219f6c43350049cef0409a9cf8ecff78f9f805226d06b5cab7163e005c"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.174385 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ca5d8219f6c43350049cef0409a9cf8ecff78f9f805226d06b5cab7163e005c"} err="failed to get container status \"0ca5d8219f6c43350049cef0409a9cf8ecff78f9f805226d06b5cab7163e005c\": rpc error: code = NotFound desc = could not find container \"0ca5d8219f6c43350049cef0409a9cf8ecff78f9f805226d06b5cab7163e005c\": container with ID starting with 0ca5d8219f6c43350049cef0409a9cf8ecff78f9f805226d06b5cab7163e005c not found: ID does not exist"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.174415 4891 scope.go:117] "RemoveContainer" containerID="e22bcb38f8c4733217facd4f6774b89f6f78639a39c68c9b7a4333c41542db18"
Sep 29 09:52:26 crc kubenswrapper[4891]: E0929 09:52:26.174811 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e22bcb38f8c4733217facd4f6774b89f6f78639a39c68c9b7a4333c41542db18\": container with ID starting with e22bcb38f8c4733217facd4f6774b89f6f78639a39c68c9b7a4333c41542db18 not found: ID does not exist" containerID="e22bcb38f8c4733217facd4f6774b89f6f78639a39c68c9b7a4333c41542db18"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.174842 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e22bcb38f8c4733217facd4f6774b89f6f78639a39c68c9b7a4333c41542db18"} err="failed to get container status \"e22bcb38f8c4733217facd4f6774b89f6f78639a39c68c9b7a4333c41542db18\": rpc error: code = NotFound desc = could not find container \"e22bcb38f8c4733217facd4f6774b89f6f78639a39c68c9b7a4333c41542db18\": container with ID starting with e22bcb38f8c4733217facd4f6774b89f6f78639a39c68c9b7a4333c41542db18 not found: ID does not exist"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.174866 4891 scope.go:117] "RemoveContainer" containerID="d4a5ae7e6a131c3a8329b50d241cdca292e97bfac3345013915d5156476da41f"
Sep 29 09:52:26 crc kubenswrapper[4891]: E0929 09:52:26.175107 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4a5ae7e6a131c3a8329b50d241cdca292e97bfac3345013915d5156476da41f\": container with ID starting with d4a5ae7e6a131c3a8329b50d241cdca292e97bfac3345013915d5156476da41f not found: ID does not exist" containerID="d4a5ae7e6a131c3a8329b50d241cdca292e97bfac3345013915d5156476da41f"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.175125 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4a5ae7e6a131c3a8329b50d241cdca292e97bfac3345013915d5156476da41f"} err="failed to get container status \"d4a5ae7e6a131c3a8329b50d241cdca292e97bfac3345013915d5156476da41f\": rpc error: code = NotFound desc = could not find container \"d4a5ae7e6a131c3a8329b50d241cdca292e97bfac3345013915d5156476da41f\": container with ID starting with d4a5ae7e6a131c3a8329b50d241cdca292e97bfac3345013915d5156476da41f not found: ID does not exist"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.175136 4891 scope.go:117] "RemoveContainer" containerID="350876b60c290718fa7b809e88c42017513c0564f09245e2c3cdfb74d9de3fbe"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.188425 4891 scope.go:117] "RemoveContainer" containerID="09f7ce7792a0263f1fffae20599bc9bd5c1a2e4cc92df364324650dea6dfb0b0"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.201852 4891 scope.go:117] "RemoveContainer" containerID="7f7f067dc28ba6d6ae17f6ed8ff6de46db326948fb4f48ded40675d864774343"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.215105 4891 scope.go:117] "RemoveContainer" containerID="350876b60c290718fa7b809e88c42017513c0564f09245e2c3cdfb74d9de3fbe"
Sep 29 09:52:26 crc kubenswrapper[4891]: E0929 09:52:26.215834 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"350876b60c290718fa7b809e88c42017513c0564f09245e2c3cdfb74d9de3fbe\": container with ID starting with 350876b60c290718fa7b809e88c42017513c0564f09245e2c3cdfb74d9de3fbe not found: ID does not exist" containerID="350876b60c290718fa7b809e88c42017513c0564f09245e2c3cdfb74d9de3fbe"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.215885 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"350876b60c290718fa7b809e88c42017513c0564f09245e2c3cdfb74d9de3fbe"} err="failed to get container status \"350876b60c290718fa7b809e88c42017513c0564f09245e2c3cdfb74d9de3fbe\": rpc error: code = NotFound desc = could not find container \"350876b60c290718fa7b809e88c42017513c0564f09245e2c3cdfb74d9de3fbe\": container with ID starting with 350876b60c290718fa7b809e88c42017513c0564f09245e2c3cdfb74d9de3fbe not found: ID does not exist"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.215918 4891 scope.go:117] "RemoveContainer" containerID="09f7ce7792a0263f1fffae20599bc9bd5c1a2e4cc92df364324650dea6dfb0b0"
Sep 29 09:52:26 crc kubenswrapper[4891]: E0929 09:52:26.216415 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09f7ce7792a0263f1fffae20599bc9bd5c1a2e4cc92df364324650dea6dfb0b0\": container with ID starting with 09f7ce7792a0263f1fffae20599bc9bd5c1a2e4cc92df364324650dea6dfb0b0 not found: ID does not exist" containerID="09f7ce7792a0263f1fffae20599bc9bd5c1a2e4cc92df364324650dea6dfb0b0"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.216484 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09f7ce7792a0263f1fffae20599bc9bd5c1a2e4cc92df364324650dea6dfb0b0"} err="failed to get container status \"09f7ce7792a0263f1fffae20599bc9bd5c1a2e4cc92df364324650dea6dfb0b0\": rpc error: code = NotFound desc = could not find container \"09f7ce7792a0263f1fffae20599bc9bd5c1a2e4cc92df364324650dea6dfb0b0\": container with ID starting with 09f7ce7792a0263f1fffae20599bc9bd5c1a2e4cc92df364324650dea6dfb0b0 not found: ID does not exist"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.216532 4891 scope.go:117] "RemoveContainer" containerID="7f7f067dc28ba6d6ae17f6ed8ff6de46db326948fb4f48ded40675d864774343"
Sep 29 09:52:26 crc kubenswrapper[4891]: E0929 09:52:26.217345 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f7f067dc28ba6d6ae17f6ed8ff6de46db326948fb4f48ded40675d864774343\": container with ID starting with 7f7f067dc28ba6d6ae17f6ed8ff6de46db326948fb4f48ded40675d864774343 not found: ID does not exist" containerID="7f7f067dc28ba6d6ae17f6ed8ff6de46db326948fb4f48ded40675d864774343"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.217385 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f7f067dc28ba6d6ae17f6ed8ff6de46db326948fb4f48ded40675d864774343"} err="failed to get container status \"7f7f067dc28ba6d6ae17f6ed8ff6de46db326948fb4f48ded40675d864774343\": rpc error: code = NotFound desc = could not find container \"7f7f067dc28ba6d6ae17f6ed8ff6de46db326948fb4f48ded40675d864774343\": container with ID starting with 7f7f067dc28ba6d6ae17f6ed8ff6de46db326948fb4f48ded40675d864774343 not found: ID does not exist"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.312689 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lpncm"]
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.315069 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lpncm"]
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.327556 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9n7fc"]
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.331284 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9n7fc"]
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.342990 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gfmkm"]
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.363114 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p4zqs"]
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.368469 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p4zqs"]
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.403134 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="255a721a-d660-405d-a4d4-9f1cfdc6bb76" path="/var/lib/kubelet/pods/255a721a-d660-405d-a4d4-9f1cfdc6bb76/volumes"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.403778 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="492762a9-9e7d-4095-b2f4-990f58b82d21" path="/var/lib/kubelet/pods/492762a9-9e7d-4095-b2f4-990f58b82d21/volumes"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.404238 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b457bc1b-81dc-4e12-a955-7cf02a8a03e6" path="/var/lib/kubelet/pods/b457bc1b-81dc-4e12-a955-7cf02a8a03e6/volumes"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.405279 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea6d0237-b356-48d2-bc1b-e28b47089506" path="/var/lib/kubelet/pods/ea6d0237-b356-48d2-bc1b-e28b47089506/volumes"
Sep 29 09:52:26 crc kubenswrapper[4891]: I0929 09:52:26.405887 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea8331db-bf12-47e7-80c8-abd1d766b214" path="/var/lib/kubelet/pods/ea8331db-bf12-47e7-80c8-abd1d766b214/volumes"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.027590 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gfmkm" event={"ID":"bcfef239-c4e7-43c6-92f3-2092cd28922b","Type":"ContainerStarted","Data":"14a67d940de1c5fb7959ed3a3e68793f67bf58884c68c112c42f552c845e38c9"}
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.028155 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gfmkm" event={"ID":"bcfef239-c4e7-43c6-92f3-2092cd28922b","Type":"ContainerStarted","Data":"2bab08bdb57abb145eefd61623c1e947b70ce1abb42727b672c9d4cba707d93f"}
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.028184 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gfmkm"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.033711 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gfmkm"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.046504 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gfmkm" podStartSLOduration=2.046486428 podStartE2EDuration="2.046486428s" podCreationTimestamp="2025-09-29 09:52:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:52:27.043872549 +0000 UTC m=+277.249040880" watchObservedRunningTime="2025-09-29 09:52:27.046486428 +0000 UTC m=+277.251654749"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.527582 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xxqvf"]
Sep 29 09:52:27 crc kubenswrapper[4891]: E0929 09:52:27.527806 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea8331db-bf12-47e7-80c8-abd1d766b214" containerName="registry-server"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.527818 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea8331db-bf12-47e7-80c8-abd1d766b214" containerName="registry-server"
Sep 29 09:52:27 crc kubenswrapper[4891]: E0929 09:52:27.527829 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b457bc1b-81dc-4e12-a955-7cf02a8a03e6" containerName="registry-server"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.527835 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="b457bc1b-81dc-4e12-a955-7cf02a8a03e6" containerName="registry-server"
Sep 29 09:52:27 crc kubenswrapper[4891]: E0929 09:52:27.527844 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="255a721a-d660-405d-a4d4-9f1cfdc6bb76" containerName="registry-server"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.527850 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="255a721a-d660-405d-a4d4-9f1cfdc6bb76" containerName="registry-server"
Sep 29 09:52:27 crc kubenswrapper[4891]: E0929 09:52:27.527860 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b457bc1b-81dc-4e12-a955-7cf02a8a03e6" containerName="extract-content"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.527866 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="b457bc1b-81dc-4e12-a955-7cf02a8a03e6" containerName="extract-content"
Sep 29 09:52:27 crc kubenswrapper[4891]: E0929 09:52:27.527874 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea6d0237-b356-48d2-bc1b-e28b47089506" containerName="extract-utilities"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.527880 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6d0237-b356-48d2-bc1b-e28b47089506" containerName="extract-utilities"
Sep 29 09:52:27 crc kubenswrapper[4891]: E0929 09:52:27.527891 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="255a721a-d660-405d-a4d4-9f1cfdc6bb76" containerName="extract-utilities"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.527897 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="255a721a-d660-405d-a4d4-9f1cfdc6bb76" containerName="extract-utilities"
Sep 29 09:52:27 crc kubenswrapper[4891]: E0929 09:52:27.527907 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b457bc1b-81dc-4e12-a955-7cf02a8a03e6" containerName="extract-utilities"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.527913 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="b457bc1b-81dc-4e12-a955-7cf02a8a03e6" containerName="extract-utilities"
Sep 29 09:52:27 crc kubenswrapper[4891]: E0929 09:52:27.527921 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="492762a9-9e7d-4095-b2f4-990f58b82d21" containerName="marketplace-operator"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.527928 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="492762a9-9e7d-4095-b2f4-990f58b82d21" containerName="marketplace-operator"
Sep 29 09:52:27 crc kubenswrapper[4891]: E0929 09:52:27.527938 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea6d0237-b356-48d2-bc1b-e28b47089506" containerName="registry-server"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.527945 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6d0237-b356-48d2-bc1b-e28b47089506" containerName="registry-server"
Sep 29 09:52:27 crc kubenswrapper[4891]: E0929 09:52:27.527953 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="255a721a-d660-405d-a4d4-9f1cfdc6bb76" containerName="extract-content"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.527962 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="255a721a-d660-405d-a4d4-9f1cfdc6bb76" containerName="extract-content"
Sep 29 09:52:27 crc kubenswrapper[4891]: E0929 09:52:27.527971 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea8331db-bf12-47e7-80c8-abd1d766b214" containerName="extract-content"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.527977 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea8331db-bf12-47e7-80c8-abd1d766b214" containerName="extract-content"
Sep 29 09:52:27 crc kubenswrapper[4891]: E0929 09:52:27.527984 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea6d0237-b356-48d2-bc1b-e28b47089506" containerName="extract-content"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.527990 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6d0237-b356-48d2-bc1b-e28b47089506" containerName="extract-content"
Sep 29 09:52:27 crc kubenswrapper[4891]: E0929 09:52:27.527999 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea8331db-bf12-47e7-80c8-abd1d766b214" containerName="extract-utilities"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.528004 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea8331db-bf12-47e7-80c8-abd1d766b214" containerName="extract-utilities"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.528080 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="b457bc1b-81dc-4e12-a955-7cf02a8a03e6" containerName="registry-server"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.528090 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea6d0237-b356-48d2-bc1b-e28b47089506" containerName="registry-server"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.528103 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea8331db-bf12-47e7-80c8-abd1d766b214" containerName="registry-server"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.528110 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="492762a9-9e7d-4095-b2f4-990f58b82d21" containerName="marketplace-operator"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.528118 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="255a721a-d660-405d-a4d4-9f1cfdc6bb76" containerName="registry-server"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.528808 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xxqvf"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.533008 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.539525 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xxqvf"]
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.558139 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c751fcd1-3522-4572-a3f1-52acfab7c45d-utilities\") pod \"certified-operators-xxqvf\" (UID: \"c751fcd1-3522-4572-a3f1-52acfab7c45d\") " pod="openshift-marketplace/certified-operators-xxqvf"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.558518 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c751fcd1-3522-4572-a3f1-52acfab7c45d-catalog-content\") pod \"certified-operators-xxqvf\" (UID: \"c751fcd1-3522-4572-a3f1-52acfab7c45d\") " pod="openshift-marketplace/certified-operators-xxqvf"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.558645 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wvbp\" (UniqueName: \"kubernetes.io/projected/c751fcd1-3522-4572-a3f1-52acfab7c45d-kube-api-access-2wvbp\") pod \"certified-operators-xxqvf\" (UID: \"c751fcd1-3522-4572-a3f1-52acfab7c45d\") " pod="openshift-marketplace/certified-operators-xxqvf"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.659328 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c751fcd1-3522-4572-a3f1-52acfab7c45d-utilities\") pod \"certified-operators-xxqvf\" (UID: \"c751fcd1-3522-4572-a3f1-52acfab7c45d\") " pod="openshift-marketplace/certified-operators-xxqvf"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.659400 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c751fcd1-3522-4572-a3f1-52acfab7c45d-catalog-content\") pod \"certified-operators-xxqvf\" (UID: \"c751fcd1-3522-4572-a3f1-52acfab7c45d\") " pod="openshift-marketplace/certified-operators-xxqvf"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.659430 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wvbp\" (UniqueName: \"kubernetes.io/projected/c751fcd1-3522-4572-a3f1-52acfab7c45d-kube-api-access-2wvbp\") pod \"certified-operators-xxqvf\" (UID: \"c751fcd1-3522-4572-a3f1-52acfab7c45d\") " pod="openshift-marketplace/certified-operators-xxqvf"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.659932 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c751fcd1-3522-4572-a3f1-52acfab7c45d-catalog-content\") pod \"certified-operators-xxqvf\" (UID: \"c751fcd1-3522-4572-a3f1-52acfab7c45d\") " pod="openshift-marketplace/certified-operators-xxqvf"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.660024 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c751fcd1-3522-4572-a3f1-52acfab7c45d-utilities\") pod \"certified-operators-xxqvf\" (UID: \"c751fcd1-3522-4572-a3f1-52acfab7c45d\") " pod="openshift-marketplace/certified-operators-xxqvf"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.682132 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wvbp\" (UniqueName: \"kubernetes.io/projected/c751fcd1-3522-4572-a3f1-52acfab7c45d-kube-api-access-2wvbp\") pod \"certified-operators-xxqvf\" (UID: \"c751fcd1-3522-4572-a3f1-52acfab7c45d\") " pod="openshift-marketplace/certified-operators-xxqvf"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.725769 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5txtz"]
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.727026 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5txtz"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.733549 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.737530 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5txtz"]
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.845692 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xxqvf"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.881554 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt4hw\" (UniqueName: \"kubernetes.io/projected/33c8f323-70e1-4e60-aeb7-f512c245885e-kube-api-access-vt4hw\") pod \"community-operators-5txtz\" (UID: \"33c8f323-70e1-4e60-aeb7-f512c245885e\") " pod="openshift-marketplace/community-operators-5txtz"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.881653 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33c8f323-70e1-4e60-aeb7-f512c245885e-utilities\") pod \"community-operators-5txtz\" (UID: \"33c8f323-70e1-4e60-aeb7-f512c245885e\") " pod="openshift-marketplace/community-operators-5txtz"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.881693 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33c8f323-70e1-4e60-aeb7-f512c245885e-catalog-content\") pod \"community-operators-5txtz\" (UID: \"33c8f323-70e1-4e60-aeb7-f512c245885e\") " pod="openshift-marketplace/community-operators-5txtz"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.982583 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33c8f323-70e1-4e60-aeb7-f512c245885e-catalog-content\") pod \"community-operators-5txtz\" (UID: \"33c8f323-70e1-4e60-aeb7-f512c245885e\") " pod="openshift-marketplace/community-operators-5txtz"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.983185 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt4hw\" (UniqueName: \"kubernetes.io/projected/33c8f323-70e1-4e60-aeb7-f512c245885e-kube-api-access-vt4hw\") pod \"community-operators-5txtz\" (UID: \"33c8f323-70e1-4e60-aeb7-f512c245885e\") " pod="openshift-marketplace/community-operators-5txtz"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.983247 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33c8f323-70e1-4e60-aeb7-f512c245885e-utilities\") pod \"community-operators-5txtz\" (UID: \"33c8f323-70e1-4e60-aeb7-f512c245885e\") " pod="openshift-marketplace/community-operators-5txtz"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.983586 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33c8f323-70e1-4e60-aeb7-f512c245885e-catalog-content\") pod \"community-operators-5txtz\" (UID: \"33c8f323-70e1-4e60-aeb7-f512c245885e\") " pod="openshift-marketplace/community-operators-5txtz"
Sep 29 09:52:27 crc kubenswrapper[4891]: I0929 09:52:27.983712 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33c8f323-70e1-4e60-aeb7-f512c245885e-utilities\") pod \"community-operators-5txtz\" (UID: \"33c8f323-70e1-4e60-aeb7-f512c245885e\") " pod="openshift-marketplace/community-operators-5txtz" Sep 29 09:52:28 crc kubenswrapper[4891]: I0929 09:52:28.005643 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt4hw\" (UniqueName: \"kubernetes.io/projected/33c8f323-70e1-4e60-aeb7-f512c245885e-kube-api-access-vt4hw\") pod \"community-operators-5txtz\" (UID: \"33c8f323-70e1-4e60-aeb7-f512c245885e\") " pod="openshift-marketplace/community-operators-5txtz" Sep 29 09:52:28 crc kubenswrapper[4891]: I0929 09:52:28.102853 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5txtz" Sep 29 09:52:28 crc kubenswrapper[4891]: I0929 09:52:28.262696 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xxqvf"] Sep 29 09:52:28 crc kubenswrapper[4891]: W0929 09:52:28.267279 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc751fcd1_3522_4572_a3f1_52acfab7c45d.slice/crio-c8fed814800a0ef8f5a0f320806c11fc886ab7150b43d3d69a7c868b7bd2f786 WatchSource:0}: Error finding container c8fed814800a0ef8f5a0f320806c11fc886ab7150b43d3d69a7c868b7bd2f786: Status 404 returned error can't find the container with id c8fed814800a0ef8f5a0f320806c11fc886ab7150b43d3d69a7c868b7bd2f786 Sep 29 09:52:28 crc kubenswrapper[4891]: I0929 09:52:28.488627 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5txtz"] Sep 29 09:52:28 crc kubenswrapper[4891]: W0929 09:52:28.495505 4891 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33c8f323_70e1_4e60_aeb7_f512c245885e.slice/crio-ce15267e0db3859a5eaf3d13ab9a77f4bcd9864a659ab48ec647a4468824517f WatchSource:0}: Error finding container ce15267e0db3859a5eaf3d13ab9a77f4bcd9864a659ab48ec647a4468824517f: Status 404 returned error can't find the container with id ce15267e0db3859a5eaf3d13ab9a77f4bcd9864a659ab48ec647a4468824517f Sep 29 09:52:29 crc kubenswrapper[4891]: I0929 09:52:29.044466 4891 generic.go:334] "Generic (PLEG): container finished" podID="33c8f323-70e1-4e60-aeb7-f512c245885e" containerID="86adc54752365f8e28812e3bf755b365aa228f41c6969a0a58b64db50e366720" exitCode=0 Sep 29 09:52:29 crc kubenswrapper[4891]: I0929 09:52:29.044530 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5txtz" event={"ID":"33c8f323-70e1-4e60-aeb7-f512c245885e","Type":"ContainerDied","Data":"86adc54752365f8e28812e3bf755b365aa228f41c6969a0a58b64db50e366720"} Sep 29 09:52:29 crc kubenswrapper[4891]: I0929 09:52:29.044592 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5txtz" event={"ID":"33c8f323-70e1-4e60-aeb7-f512c245885e","Type":"ContainerStarted","Data":"ce15267e0db3859a5eaf3d13ab9a77f4bcd9864a659ab48ec647a4468824517f"} Sep 29 09:52:29 crc kubenswrapper[4891]: I0929 09:52:29.048697 4891 generic.go:334] "Generic (PLEG): container finished" podID="c751fcd1-3522-4572-a3f1-52acfab7c45d" containerID="55e56849dfdffa4e7ac53b52c40c608caf925a84c7ebf3663d6de6f76be75c96" exitCode=0 Sep 29 09:52:29 crc kubenswrapper[4891]: I0929 09:52:29.050122 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxqvf" event={"ID":"c751fcd1-3522-4572-a3f1-52acfab7c45d","Type":"ContainerDied","Data":"55e56849dfdffa4e7ac53b52c40c608caf925a84c7ebf3663d6de6f76be75c96"} Sep 29 09:52:29 crc kubenswrapper[4891]: I0929 09:52:29.050147 4891 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-xxqvf" event={"ID":"c751fcd1-3522-4572-a3f1-52acfab7c45d","Type":"ContainerStarted","Data":"c8fed814800a0ef8f5a0f320806c11fc886ab7150b43d3d69a7c868b7bd2f786"} Sep 29 09:52:29 crc kubenswrapper[4891]: I0929 09:52:29.944647 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j656v"] Sep 29 09:52:29 crc kubenswrapper[4891]: I0929 09:52:29.955607 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j656v"] Sep 29 09:52:29 crc kubenswrapper[4891]: I0929 09:52:29.956297 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j656v" Sep 29 09:52:29 crc kubenswrapper[4891]: I0929 09:52:29.958732 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.018400 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r956x\" (UniqueName: \"kubernetes.io/projected/5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe-kube-api-access-r956x\") pod \"redhat-marketplace-j656v\" (UID: \"5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe\") " pod="openshift-marketplace/redhat-marketplace-j656v" Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.018468 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe-catalog-content\") pod \"redhat-marketplace-j656v\" (UID: \"5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe\") " pod="openshift-marketplace/redhat-marketplace-j656v" Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.018493 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe-utilities\") pod \"redhat-marketplace-j656v\" (UID: \"5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe\") " pod="openshift-marketplace/redhat-marketplace-j656v" Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.058063 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5txtz" event={"ID":"33c8f323-70e1-4e60-aeb7-f512c245885e","Type":"ContainerStarted","Data":"c229803dbc245da108a5f2d3b6a965bbf3d18e1b3d53067ce97f1a8d7fac2aaf"} Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.060159 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxqvf" event={"ID":"c751fcd1-3522-4572-a3f1-52acfab7c45d","Type":"ContainerStarted","Data":"388f12f36685c658e3ae4564c181f695f8c94eeff53af7e7de49ce9c8fb2843a"} Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.120366 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe-catalog-content\") pod \"redhat-marketplace-j656v\" (UID: \"5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe\") " pod="openshift-marketplace/redhat-marketplace-j656v" Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.120444 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe-utilities\") pod \"redhat-marketplace-j656v\" (UID: \"5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe\") " pod="openshift-marketplace/redhat-marketplace-j656v" Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.120557 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r956x\" (UniqueName: \"kubernetes.io/projected/5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe-kube-api-access-r956x\") pod \"redhat-marketplace-j656v\" (UID: \"5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe\") " 
pod="openshift-marketplace/redhat-marketplace-j656v" Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.121217 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe-catalog-content\") pod \"redhat-marketplace-j656v\" (UID: \"5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe\") " pod="openshift-marketplace/redhat-marketplace-j656v" Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.121233 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe-utilities\") pod \"redhat-marketplace-j656v\" (UID: \"5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe\") " pod="openshift-marketplace/redhat-marketplace-j656v" Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.127141 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sn22t"] Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.128198 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sn22t" Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.130920 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.141269 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sn22t"] Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.154032 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r956x\" (UniqueName: \"kubernetes.io/projected/5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe-kube-api-access-r956x\") pod \"redhat-marketplace-j656v\" (UID: \"5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe\") " pod="openshift-marketplace/redhat-marketplace-j656v" Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.222304 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c2eff33-cdda-491e-a057-a6b1e0a2bd10-utilities\") pod \"redhat-operators-sn22t\" (UID: \"5c2eff33-cdda-491e-a057-a6b1e0a2bd10\") " pod="openshift-marketplace/redhat-operators-sn22t" Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.222381 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pl9v\" (UniqueName: \"kubernetes.io/projected/5c2eff33-cdda-491e-a057-a6b1e0a2bd10-kube-api-access-4pl9v\") pod \"redhat-operators-sn22t\" (UID: \"5c2eff33-cdda-491e-a057-a6b1e0a2bd10\") " pod="openshift-marketplace/redhat-operators-sn22t" Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.222410 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c2eff33-cdda-491e-a057-a6b1e0a2bd10-catalog-content\") pod \"redhat-operators-sn22t\" (UID: 
\"5c2eff33-cdda-491e-a057-a6b1e0a2bd10\") " pod="openshift-marketplace/redhat-operators-sn22t" Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.318431 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j656v" Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.323416 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pl9v\" (UniqueName: \"kubernetes.io/projected/5c2eff33-cdda-491e-a057-a6b1e0a2bd10-kube-api-access-4pl9v\") pod \"redhat-operators-sn22t\" (UID: \"5c2eff33-cdda-491e-a057-a6b1e0a2bd10\") " pod="openshift-marketplace/redhat-operators-sn22t" Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.323578 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c2eff33-cdda-491e-a057-a6b1e0a2bd10-catalog-content\") pod \"redhat-operators-sn22t\" (UID: \"5c2eff33-cdda-491e-a057-a6b1e0a2bd10\") " pod="openshift-marketplace/redhat-operators-sn22t" Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.323753 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c2eff33-cdda-491e-a057-a6b1e0a2bd10-utilities\") pod \"redhat-operators-sn22t\" (UID: \"5c2eff33-cdda-491e-a057-a6b1e0a2bd10\") " pod="openshift-marketplace/redhat-operators-sn22t" Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.324414 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c2eff33-cdda-491e-a057-a6b1e0a2bd10-utilities\") pod \"redhat-operators-sn22t\" (UID: \"5c2eff33-cdda-491e-a057-a6b1e0a2bd10\") " pod="openshift-marketplace/redhat-operators-sn22t" Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.324501 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5c2eff33-cdda-491e-a057-a6b1e0a2bd10-catalog-content\") pod \"redhat-operators-sn22t\" (UID: \"5c2eff33-cdda-491e-a057-a6b1e0a2bd10\") " pod="openshift-marketplace/redhat-operators-sn22t" Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.342916 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pl9v\" (UniqueName: \"kubernetes.io/projected/5c2eff33-cdda-491e-a057-a6b1e0a2bd10-kube-api-access-4pl9v\") pod \"redhat-operators-sn22t\" (UID: \"5c2eff33-cdda-491e-a057-a6b1e0a2bd10\") " pod="openshift-marketplace/redhat-operators-sn22t" Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.448337 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sn22t" Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.657535 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sn22t"] Sep 29 09:52:30 crc kubenswrapper[4891]: W0929 09:52:30.666147 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c2eff33_cdda_491e_a057_a6b1e0a2bd10.slice/crio-ae04ef6659973c8881039a413d05c2718d76ff56a1cd1498b683f705f80a111e WatchSource:0}: Error finding container ae04ef6659973c8881039a413d05c2718d76ff56a1cd1498b683f705f80a111e: Status 404 returned error can't find the container with id ae04ef6659973c8881039a413d05c2718d76ff56a1cd1498b683f705f80a111e Sep 29 09:52:30 crc kubenswrapper[4891]: I0929 09:52:30.739266 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j656v"] Sep 29 09:52:30 crc kubenswrapper[4891]: W0929 09:52:30.755105 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d92c0d6_f7a3_4ff2_9efb_400745c4f7fe.slice/crio-405ca949b28b45319844a9e1f59b6e80d520f494ec8d9f546b5c13711df8eba0 
WatchSource:0}: Error finding container 405ca949b28b45319844a9e1f59b6e80d520f494ec8d9f546b5c13711df8eba0: Status 404 returned error can't find the container with id 405ca949b28b45319844a9e1f59b6e80d520f494ec8d9f546b5c13711df8eba0 Sep 29 09:52:31 crc kubenswrapper[4891]: I0929 09:52:31.066904 4891 generic.go:334] "Generic (PLEG): container finished" podID="33c8f323-70e1-4e60-aeb7-f512c245885e" containerID="c229803dbc245da108a5f2d3b6a965bbf3d18e1b3d53067ce97f1a8d7fac2aaf" exitCode=0 Sep 29 09:52:31 crc kubenswrapper[4891]: I0929 09:52:31.066964 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5txtz" event={"ID":"33c8f323-70e1-4e60-aeb7-f512c245885e","Type":"ContainerDied","Data":"c229803dbc245da108a5f2d3b6a965bbf3d18e1b3d53067ce97f1a8d7fac2aaf"} Sep 29 09:52:31 crc kubenswrapper[4891]: I0929 09:52:31.069283 4891 generic.go:334] "Generic (PLEG): container finished" podID="c751fcd1-3522-4572-a3f1-52acfab7c45d" containerID="388f12f36685c658e3ae4564c181f695f8c94eeff53af7e7de49ce9c8fb2843a" exitCode=0 Sep 29 09:52:31 crc kubenswrapper[4891]: I0929 09:52:31.069340 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxqvf" event={"ID":"c751fcd1-3522-4572-a3f1-52acfab7c45d","Type":"ContainerDied","Data":"388f12f36685c658e3ae4564c181f695f8c94eeff53af7e7de49ce9c8fb2843a"} Sep 29 09:52:31 crc kubenswrapper[4891]: I0929 09:52:31.072035 4891 generic.go:334] "Generic (PLEG): container finished" podID="5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe" containerID="2ed5109ab897f4a2065c9dfb638beb069dfbb31d1b044ffff94c2018362a9b52" exitCode=0 Sep 29 09:52:31 crc kubenswrapper[4891]: I0929 09:52:31.072104 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j656v" event={"ID":"5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe","Type":"ContainerDied","Data":"2ed5109ab897f4a2065c9dfb638beb069dfbb31d1b044ffff94c2018362a9b52"} Sep 29 09:52:31 crc 
kubenswrapper[4891]: I0929 09:52:31.072131 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j656v" event={"ID":"5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe","Type":"ContainerStarted","Data":"405ca949b28b45319844a9e1f59b6e80d520f494ec8d9f546b5c13711df8eba0"} Sep 29 09:52:31 crc kubenswrapper[4891]: I0929 09:52:31.074611 4891 generic.go:334] "Generic (PLEG): container finished" podID="5c2eff33-cdda-491e-a057-a6b1e0a2bd10" containerID="9f05eea26b9a8b8464c79abd1be381a73b2abeb9ae7a2c6e04833cd56107400d" exitCode=0 Sep 29 09:52:31 crc kubenswrapper[4891]: I0929 09:52:31.074727 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn22t" event={"ID":"5c2eff33-cdda-491e-a057-a6b1e0a2bd10","Type":"ContainerDied","Data":"9f05eea26b9a8b8464c79abd1be381a73b2abeb9ae7a2c6e04833cd56107400d"} Sep 29 09:52:31 crc kubenswrapper[4891]: I0929 09:52:31.074757 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn22t" event={"ID":"5c2eff33-cdda-491e-a057-a6b1e0a2bd10","Type":"ContainerStarted","Data":"ae04ef6659973c8881039a413d05c2718d76ff56a1cd1498b683f705f80a111e"} Sep 29 09:52:32 crc kubenswrapper[4891]: I0929 09:52:32.082741 4891 generic.go:334] "Generic (PLEG): container finished" podID="5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe" containerID="a4d485f6558c7a7e5ca5c4a57a5d0a0fd7f532f0a0cedb2d2b55ead67087c78c" exitCode=0 Sep 29 09:52:32 crc kubenswrapper[4891]: I0929 09:52:32.082937 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j656v" event={"ID":"5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe","Type":"ContainerDied","Data":"a4d485f6558c7a7e5ca5c4a57a5d0a0fd7f532f0a0cedb2d2b55ead67087c78c"} Sep 29 09:52:32 crc kubenswrapper[4891]: I0929 09:52:32.086733 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn22t" 
event={"ID":"5c2eff33-cdda-491e-a057-a6b1e0a2bd10","Type":"ContainerStarted","Data":"e4d0ec582572cb473093b9135646d51b37ddaf22299a23c14bbdf128428f881e"} Sep 29 09:52:32 crc kubenswrapper[4891]: I0929 09:52:32.091426 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5txtz" event={"ID":"33c8f323-70e1-4e60-aeb7-f512c245885e","Type":"ContainerStarted","Data":"2e05ee11a8b2ec58d9a53c3ec81d02883069a032f2725f6fc8bfaf9f4c0c1367"} Sep 29 09:52:32 crc kubenswrapper[4891]: I0929 09:52:32.093942 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxqvf" event={"ID":"c751fcd1-3522-4572-a3f1-52acfab7c45d","Type":"ContainerStarted","Data":"466580be29ba8b62e19b322ecb578c56395a9357998af00b644e6bfb518a7b57"} Sep 29 09:52:32 crc kubenswrapper[4891]: I0929 09:52:32.161721 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xxqvf" podStartSLOduration=2.716903173 podStartE2EDuration="5.161691271s" podCreationTimestamp="2025-09-29 09:52:27 +0000 UTC" firstStartedPulling="2025-09-29 09:52:29.050661343 +0000 UTC m=+279.255829664" lastFinishedPulling="2025-09-29 09:52:31.495449431 +0000 UTC m=+281.700617762" observedRunningTime="2025-09-29 09:52:32.158038671 +0000 UTC m=+282.363207002" watchObservedRunningTime="2025-09-29 09:52:32.161691271 +0000 UTC m=+282.366859592" Sep 29 09:52:32 crc kubenswrapper[4891]: I0929 09:52:32.177936 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5txtz" podStartSLOduration=2.67514235 podStartE2EDuration="5.177909122s" podCreationTimestamp="2025-09-29 09:52:27 +0000 UTC" firstStartedPulling="2025-09-29 09:52:29.049769166 +0000 UTC m=+279.254937487" lastFinishedPulling="2025-09-29 09:52:31.552535938 +0000 UTC m=+281.757704259" observedRunningTime="2025-09-29 09:52:32.177480579 +0000 UTC m=+282.382648920" 
watchObservedRunningTime="2025-09-29 09:52:32.177909122 +0000 UTC m=+282.383077443" Sep 29 09:52:33 crc kubenswrapper[4891]: I0929 09:52:33.103312 4891 generic.go:334] "Generic (PLEG): container finished" podID="5c2eff33-cdda-491e-a057-a6b1e0a2bd10" containerID="e4d0ec582572cb473093b9135646d51b37ddaf22299a23c14bbdf128428f881e" exitCode=0 Sep 29 09:52:33 crc kubenswrapper[4891]: I0929 09:52:33.103408 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn22t" event={"ID":"5c2eff33-cdda-491e-a057-a6b1e0a2bd10","Type":"ContainerDied","Data":"e4d0ec582572cb473093b9135646d51b37ddaf22299a23c14bbdf128428f881e"} Sep 29 09:52:34 crc kubenswrapper[4891]: I0929 09:52:34.113515 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j656v" event={"ID":"5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe","Type":"ContainerStarted","Data":"4a7c9dd08f9aa672f0595460e1283814883d53686fd19f6bc7aaccb7e2e72dca"} Sep 29 09:52:34 crc kubenswrapper[4891]: I0929 09:52:34.134338 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j656v" podStartSLOduration=2.670799127 podStartE2EDuration="5.134307132s" podCreationTimestamp="2025-09-29 09:52:29 +0000 UTC" firstStartedPulling="2025-09-29 09:52:31.073959597 +0000 UTC m=+281.279127918" lastFinishedPulling="2025-09-29 09:52:33.537467602 +0000 UTC m=+283.742635923" observedRunningTime="2025-09-29 09:52:34.13257849 +0000 UTC m=+284.337746831" watchObservedRunningTime="2025-09-29 09:52:34.134307132 +0000 UTC m=+284.339475463" Sep 29 09:52:35 crc kubenswrapper[4891]: I0929 09:52:35.122544 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn22t" event={"ID":"5c2eff33-cdda-491e-a057-a6b1e0a2bd10","Type":"ContainerStarted","Data":"21c28b138e5f0fe7c754537e30c5308cd9442f2dec2a7b504ee92b3ff0ef252f"} Sep 29 09:52:35 crc kubenswrapper[4891]: I0929 09:52:35.159999 4891 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sn22t" podStartSLOduration=2.143399326 podStartE2EDuration="5.159973557s" podCreationTimestamp="2025-09-29 09:52:30 +0000 UTC" firstStartedPulling="2025-09-29 09:52:31.076508924 +0000 UTC m=+281.281677245" lastFinishedPulling="2025-09-29 09:52:34.093083155 +0000 UTC m=+284.298251476" observedRunningTime="2025-09-29 09:52:35.150604614 +0000 UTC m=+285.355772945" watchObservedRunningTime="2025-09-29 09:52:35.159973557 +0000 UTC m=+285.365141878" Sep 29 09:52:37 crc kubenswrapper[4891]: I0929 09:52:37.845937 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xxqvf" Sep 29 09:52:37 crc kubenswrapper[4891]: I0929 09:52:37.846347 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xxqvf" Sep 29 09:52:37 crc kubenswrapper[4891]: I0929 09:52:37.900433 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xxqvf" Sep 29 09:52:38 crc kubenswrapper[4891]: I0929 09:52:38.103732 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5txtz" Sep 29 09:52:38 crc kubenswrapper[4891]: I0929 09:52:38.103783 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5txtz" Sep 29 09:52:38 crc kubenswrapper[4891]: I0929 09:52:38.152090 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5txtz" Sep 29 09:52:38 crc kubenswrapper[4891]: I0929 09:52:38.183219 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xxqvf" Sep 29 09:52:38 crc kubenswrapper[4891]: I0929 09:52:38.203896 4891 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5txtz" Sep 29 09:52:40 crc kubenswrapper[4891]: I0929 09:52:40.318924 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j656v" Sep 29 09:52:40 crc kubenswrapper[4891]: I0929 09:52:40.319312 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j656v" Sep 29 09:52:40 crc kubenswrapper[4891]: I0929 09:52:40.367665 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j656v" Sep 29 09:52:40 crc kubenswrapper[4891]: I0929 09:52:40.449085 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sn22t" Sep 29 09:52:40 crc kubenswrapper[4891]: I0929 09:52:40.449277 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sn22t" Sep 29 09:52:40 crc kubenswrapper[4891]: I0929 09:52:40.519323 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sn22t" Sep 29 09:52:41 crc kubenswrapper[4891]: I0929 09:52:41.203725 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sn22t" Sep 29 09:52:41 crc kubenswrapper[4891]: I0929 09:52:41.208938 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j656v" Sep 29 09:54:06 crc kubenswrapper[4891]: I0929 09:54:06.186511 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 09:54:06 crc kubenswrapper[4891]: I0929 
09:54:06.187195 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 09:54:36 crc kubenswrapper[4891]: I0929 09:54:36.186394 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 09:54:36 crc kubenswrapper[4891]: I0929 09:54:36.187016 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 09:55:06 crc kubenswrapper[4891]: I0929 09:55:06.186665 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 09:55:06 crc kubenswrapper[4891]: I0929 09:55:06.187328 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 09:55:06 crc kubenswrapper[4891]: I0929 09:55:06.187389 4891 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 09:55:06 crc kubenswrapper[4891]: I0929 09:55:06.188087 4891 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad896143667aee79e9b59c715f3d34dab8dd50a3b2883d46a38afda965f786f6"} pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 09:55:06 crc kubenswrapper[4891]: I0929 09:55:06.188169 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" containerID="cri-o://ad896143667aee79e9b59c715f3d34dab8dd50a3b2883d46a38afda965f786f6" gracePeriod=600 Sep 29 09:55:07 crc kubenswrapper[4891]: I0929 09:55:07.018736 4891 generic.go:334] "Generic (PLEG): container finished" podID="582de198-5a15-4c4c-aaea-881c638a42ac" containerID="ad896143667aee79e9b59c715f3d34dab8dd50a3b2883d46a38afda965f786f6" exitCode=0 Sep 29 09:55:07 crc kubenswrapper[4891]: I0929 09:55:07.018873 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerDied","Data":"ad896143667aee79e9b59c715f3d34dab8dd50a3b2883d46a38afda965f786f6"} Sep 29 09:55:07 crc kubenswrapper[4891]: I0929 09:55:07.019340 4891 scope.go:117] "RemoveContainer" containerID="91afd0d56169c1f360c57ceb97957bc48e79615ded802e7f78b8bcb6939d55b3" Sep 29 09:55:08 crc kubenswrapper[4891]: I0929 09:55:08.027510 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" 
event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerStarted","Data":"8378d58b094c2cac919b4d4b3b96c7247b1168ddd946002225b707d6b5dec558"} Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.745120 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-zd674"] Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.746744 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.756856 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-zd674"] Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.824562 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9bfe0946-52ce-4fb7-94df-e7b619151de1-trusted-ca\") pod \"image-registry-66df7c8f76-zd674\" (UID: \"9bfe0946-52ce-4fb7-94df-e7b619151de1\") " pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.824621 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-zd674\" (UID: \"9bfe0946-52ce-4fb7-94df-e7b619151de1\") " pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.824649 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9bfe0946-52ce-4fb7-94df-e7b619151de1-registry-certificates\") pod \"image-registry-66df7c8f76-zd674\" (UID: \"9bfe0946-52ce-4fb7-94df-e7b619151de1\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.824680 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9bfe0946-52ce-4fb7-94df-e7b619151de1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-zd674\" (UID: \"9bfe0946-52ce-4fb7-94df-e7b619151de1\") " pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.824698 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9bfe0946-52ce-4fb7-94df-e7b619151de1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-zd674\" (UID: \"9bfe0946-52ce-4fb7-94df-e7b619151de1\") " pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.824819 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pdnp\" (UniqueName: \"kubernetes.io/projected/9bfe0946-52ce-4fb7-94df-e7b619151de1-kube-api-access-4pdnp\") pod \"image-registry-66df7c8f76-zd674\" (UID: \"9bfe0946-52ce-4fb7-94df-e7b619151de1\") " pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.824861 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9bfe0946-52ce-4fb7-94df-e7b619151de1-registry-tls\") pod \"image-registry-66df7c8f76-zd674\" (UID: \"9bfe0946-52ce-4fb7-94df-e7b619151de1\") " pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.824933 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/9bfe0946-52ce-4fb7-94df-e7b619151de1-bound-sa-token\") pod \"image-registry-66df7c8f76-zd674\" (UID: \"9bfe0946-52ce-4fb7-94df-e7b619151de1\") " pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.852300 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-zd674\" (UID: \"9bfe0946-52ce-4fb7-94df-e7b619151de1\") " pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.925941 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9bfe0946-52ce-4fb7-94df-e7b619151de1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-zd674\" (UID: \"9bfe0946-52ce-4fb7-94df-e7b619151de1\") " pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.925993 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9bfe0946-52ce-4fb7-94df-e7b619151de1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-zd674\" (UID: \"9bfe0946-52ce-4fb7-94df-e7b619151de1\") " pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.926013 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pdnp\" (UniqueName: \"kubernetes.io/projected/9bfe0946-52ce-4fb7-94df-e7b619151de1-kube-api-access-4pdnp\") pod \"image-registry-66df7c8f76-zd674\" (UID: \"9bfe0946-52ce-4fb7-94df-e7b619151de1\") " pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:10 crc 
kubenswrapper[4891]: I0929 09:56:10.926038 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9bfe0946-52ce-4fb7-94df-e7b619151de1-registry-tls\") pod \"image-registry-66df7c8f76-zd674\" (UID: \"9bfe0946-52ce-4fb7-94df-e7b619151de1\") " pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.926077 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9bfe0946-52ce-4fb7-94df-e7b619151de1-bound-sa-token\") pod \"image-registry-66df7c8f76-zd674\" (UID: \"9bfe0946-52ce-4fb7-94df-e7b619151de1\") " pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.926101 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9bfe0946-52ce-4fb7-94df-e7b619151de1-trusted-ca\") pod \"image-registry-66df7c8f76-zd674\" (UID: \"9bfe0946-52ce-4fb7-94df-e7b619151de1\") " pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.926128 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9bfe0946-52ce-4fb7-94df-e7b619151de1-registry-certificates\") pod \"image-registry-66df7c8f76-zd674\" (UID: \"9bfe0946-52ce-4fb7-94df-e7b619151de1\") " pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.926586 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9bfe0946-52ce-4fb7-94df-e7b619151de1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-zd674\" (UID: \"9bfe0946-52ce-4fb7-94df-e7b619151de1\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.927390 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9bfe0946-52ce-4fb7-94df-e7b619151de1-registry-certificates\") pod \"image-registry-66df7c8f76-zd674\" (UID: \"9bfe0946-52ce-4fb7-94df-e7b619151de1\") " pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.927939 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9bfe0946-52ce-4fb7-94df-e7b619151de1-trusted-ca\") pod \"image-registry-66df7c8f76-zd674\" (UID: \"9bfe0946-52ce-4fb7-94df-e7b619151de1\") " pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.934872 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9bfe0946-52ce-4fb7-94df-e7b619151de1-registry-tls\") pod \"image-registry-66df7c8f76-zd674\" (UID: \"9bfe0946-52ce-4fb7-94df-e7b619151de1\") " pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.935306 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9bfe0946-52ce-4fb7-94df-e7b619151de1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-zd674\" (UID: \"9bfe0946-52ce-4fb7-94df-e7b619151de1\") " pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.943124 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9bfe0946-52ce-4fb7-94df-e7b619151de1-bound-sa-token\") pod \"image-registry-66df7c8f76-zd674\" (UID: 
\"9bfe0946-52ce-4fb7-94df-e7b619151de1\") " pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:10 crc kubenswrapper[4891]: I0929 09:56:10.943993 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pdnp\" (UniqueName: \"kubernetes.io/projected/9bfe0946-52ce-4fb7-94df-e7b619151de1-kube-api-access-4pdnp\") pod \"image-registry-66df7c8f76-zd674\" (UID: \"9bfe0946-52ce-4fb7-94df-e7b619151de1\") " pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:11 crc kubenswrapper[4891]: I0929 09:56:11.062081 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:11 crc kubenswrapper[4891]: I0929 09:56:11.237143 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-zd674"] Sep 29 09:56:11 crc kubenswrapper[4891]: I0929 09:56:11.380657 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-zd674" event={"ID":"9bfe0946-52ce-4fb7-94df-e7b619151de1","Type":"ContainerStarted","Data":"f0442a7412be756b0aa531a41a40d1113b0bac0af3dafdd658fa37719992cac7"} Sep 29 09:56:12 crc kubenswrapper[4891]: I0929 09:56:12.386902 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-zd674" event={"ID":"9bfe0946-52ce-4fb7-94df-e7b619151de1","Type":"ContainerStarted","Data":"f576729d9d882094c893b382abc25d29291bda77df3bff317abbd845e5bb9b33"} Sep 29 09:56:12 crc kubenswrapper[4891]: I0929 09:56:12.387321 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:12 crc kubenswrapper[4891]: I0929 09:56:12.405005 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-zd674" podStartSLOduration=2.404985996 
podStartE2EDuration="2.404985996s" podCreationTimestamp="2025-09-29 09:56:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:56:12.403631124 +0000 UTC m=+502.608799445" watchObservedRunningTime="2025-09-29 09:56:12.404985996 +0000 UTC m=+502.610154327" Sep 29 09:56:31 crc kubenswrapper[4891]: I0929 09:56:31.081317 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-zd674" Sep 29 09:56:31 crc kubenswrapper[4891]: I0929 09:56:31.132393 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mgnzm"] Sep 29 09:56:50 crc kubenswrapper[4891]: I0929 09:56:50.588925 4891 scope.go:117] "RemoveContainer" containerID="f558300f7aebf05a5e20e043703bfc9cd648d29cb1554342a8eee588fe07203b" Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.174081 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" podUID="23b1d0f6-a8a5-4644-9e05-614ccbfffd2d" containerName="registry" containerID="cri-o://619a3d2a03b3780164677fa9e4c19ed0c57b09ebcd723f695a1f8552414ed321" gracePeriod=30 Sep 29 09:56:56 crc kubenswrapper[4891]: E0929 09:56:56.278430 4891 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23b1d0f6_a8a5_4644_9e05_614ccbfffd2d.slice/crio-619a3d2a03b3780164677fa9e4c19ed0c57b09ebcd723f695a1f8552414ed321.scope\": RecentStats: unable to find data in memory cache]" Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.496163 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.543315 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.543446 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-registry-tls\") pod \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.543473 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxnw4\" (UniqueName: \"kubernetes.io/projected/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-kube-api-access-wxnw4\") pod \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.543540 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-installation-pull-secrets\") pod \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.544664 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-bound-sa-token\") pod \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.544698 4891 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-trusted-ca\") pod \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.544781 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-registry-certificates\") pod \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.544879 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-ca-trust-extracted\") pod \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\" (UID: \"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d\") " Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.545449 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.545505 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.550913 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.553417 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-kube-api-access-wxnw4" (OuterVolumeSpecName: "kube-api-access-wxnw4") pod "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d"). InnerVolumeSpecName "kube-api-access-wxnw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.553907 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.554077 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.557574 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.568375 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d" (UID: "23b1d0f6-a8a5-4644-9e05-614ccbfffd2d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.646348 4891 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.646379 4891 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.646390 4891 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.646401 4891 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.646413 4891 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.646424 4891 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.646434 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxnw4\" (UniqueName: \"kubernetes.io/projected/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d-kube-api-access-wxnw4\") on node \"crc\" DevicePath \"\"" Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.656887 4891 generic.go:334] "Generic (PLEG): container finished" podID="23b1d0f6-a8a5-4644-9e05-614ccbfffd2d" containerID="619a3d2a03b3780164677fa9e4c19ed0c57b09ebcd723f695a1f8552414ed321" exitCode=0 Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.656931 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" event={"ID":"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d","Type":"ContainerDied","Data":"619a3d2a03b3780164677fa9e4c19ed0c57b09ebcd723f695a1f8552414ed321"} Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.656972 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" event={"ID":"23b1d0f6-a8a5-4644-9e05-614ccbfffd2d","Type":"ContainerDied","Data":"d4c7af6e94172f4a4fe8530aba7b6469829f2db8797abe7fd5eafc038b08193d"} Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.656994 4891 scope.go:117] "RemoveContainer" 
containerID="619a3d2a03b3780164677fa9e4c19ed0c57b09ebcd723f695a1f8552414ed321" Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.656989 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mgnzm" Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.679505 4891 scope.go:117] "RemoveContainer" containerID="619a3d2a03b3780164677fa9e4c19ed0c57b09ebcd723f695a1f8552414ed321" Sep 29 09:56:56 crc kubenswrapper[4891]: E0929 09:56:56.680049 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"619a3d2a03b3780164677fa9e4c19ed0c57b09ebcd723f695a1f8552414ed321\": container with ID starting with 619a3d2a03b3780164677fa9e4c19ed0c57b09ebcd723f695a1f8552414ed321 not found: ID does not exist" containerID="619a3d2a03b3780164677fa9e4c19ed0c57b09ebcd723f695a1f8552414ed321" Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.680083 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"619a3d2a03b3780164677fa9e4c19ed0c57b09ebcd723f695a1f8552414ed321"} err="failed to get container status \"619a3d2a03b3780164677fa9e4c19ed0c57b09ebcd723f695a1f8552414ed321\": rpc error: code = NotFound desc = could not find container \"619a3d2a03b3780164677fa9e4c19ed0c57b09ebcd723f695a1f8552414ed321\": container with ID starting with 619a3d2a03b3780164677fa9e4c19ed0c57b09ebcd723f695a1f8552414ed321 not found: ID does not exist" Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.686912 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mgnzm"] Sep 29 09:56:56 crc kubenswrapper[4891]: I0929 09:56:56.690294 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mgnzm"] Sep 29 09:56:58 crc kubenswrapper[4891]: I0929 09:56:58.402911 4891 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="23b1d0f6-a8a5-4644-9e05-614ccbfffd2d" path="/var/lib/kubelet/pods/23b1d0f6-a8a5-4644-9e05-614ccbfffd2d/volumes" Sep 29 09:57:36 crc kubenswrapper[4891]: I0929 09:57:36.185659 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 09:57:36 crc kubenswrapper[4891]: I0929 09:57:36.186355 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 09:57:50 crc kubenswrapper[4891]: I0929 09:57:50.613122 4891 scope.go:117] "RemoveContainer" containerID="8b897984eb7db0fc90509244e0de54bf9304241e1265e56c421053b09d09ea73" Sep 29 09:57:50 crc kubenswrapper[4891]: I0929 09:57:50.628959 4891 scope.go:117] "RemoveContainer" containerID="840b58f75368f59e243db870167fd57d9c2dde0eb3719b186e8b7fb5fd485d07" Sep 29 09:58:06 crc kubenswrapper[4891]: I0929 09:58:06.186442 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 09:58:06 crc kubenswrapper[4891]: I0929 09:58:06.187491 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.261906 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-dtxr6"] Sep 29 09:58:11 crc kubenswrapper[4891]: E0929 09:58:11.262609 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b1d0f6-a8a5-4644-9e05-614ccbfffd2d" containerName="registry" Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.262626 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b1d0f6-a8a5-4644-9e05-614ccbfffd2d" containerName="registry" Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.262716 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="23b1d0f6-a8a5-4644-9e05-614ccbfffd2d" containerName="registry" Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.263225 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-dtxr6" Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.265536 4891 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-6vzsg" Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.266125 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.266211 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.272440 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-xxtv8"] Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.273306 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-xxtv8" Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.275600 4891 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-24nwl" Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.293039 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-xxtv8"] Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.296103 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-sf94x"] Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.300717 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-sf94x" Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.306022 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-sf94x"] Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.310152 4891 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-d9btn" Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.319440 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-dtxr6"] Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.354608 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltmrt\" (UniqueName: \"kubernetes.io/projected/32062242-b85f-4c38-a6dd-5701216a7a26-kube-api-access-ltmrt\") pod \"cert-manager-cainjector-7f985d654d-dtxr6\" (UID: \"32062242-b85f-4c38-a6dd-5701216a7a26\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-dtxr6" Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.355019 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vclhb\" (UniqueName: 
\"kubernetes.io/projected/10df5ae8-eb89-4efd-8877-6a87a962fbe7-kube-api-access-vclhb\") pod \"cert-manager-5b446d88c5-xxtv8\" (UID: \"10df5ae8-eb89-4efd-8877-6a87a962fbe7\") " pod="cert-manager/cert-manager-5b446d88c5-xxtv8" Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.355282 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cbwr\" (UniqueName: \"kubernetes.io/projected/b1de57da-fef3-4c24-a501-7f14e9973be9-kube-api-access-9cbwr\") pod \"cert-manager-webhook-5655c58dd6-sf94x\" (UID: \"b1de57da-fef3-4c24-a501-7f14e9973be9\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-sf94x" Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.456683 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vclhb\" (UniqueName: \"kubernetes.io/projected/10df5ae8-eb89-4efd-8877-6a87a962fbe7-kube-api-access-vclhb\") pod \"cert-manager-5b446d88c5-xxtv8\" (UID: \"10df5ae8-eb89-4efd-8877-6a87a962fbe7\") " pod="cert-manager/cert-manager-5b446d88c5-xxtv8" Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.456782 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cbwr\" (UniqueName: \"kubernetes.io/projected/b1de57da-fef3-4c24-a501-7f14e9973be9-kube-api-access-9cbwr\") pod \"cert-manager-webhook-5655c58dd6-sf94x\" (UID: \"b1de57da-fef3-4c24-a501-7f14e9973be9\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-sf94x" Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.456905 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltmrt\" (UniqueName: \"kubernetes.io/projected/32062242-b85f-4c38-a6dd-5701216a7a26-kube-api-access-ltmrt\") pod \"cert-manager-cainjector-7f985d654d-dtxr6\" (UID: \"32062242-b85f-4c38-a6dd-5701216a7a26\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-dtxr6" Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.479459 4891 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vclhb\" (UniqueName: \"kubernetes.io/projected/10df5ae8-eb89-4efd-8877-6a87a962fbe7-kube-api-access-vclhb\") pod \"cert-manager-5b446d88c5-xxtv8\" (UID: \"10df5ae8-eb89-4efd-8877-6a87a962fbe7\") " pod="cert-manager/cert-manager-5b446d88c5-xxtv8" Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.479467 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltmrt\" (UniqueName: \"kubernetes.io/projected/32062242-b85f-4c38-a6dd-5701216a7a26-kube-api-access-ltmrt\") pod \"cert-manager-cainjector-7f985d654d-dtxr6\" (UID: \"32062242-b85f-4c38-a6dd-5701216a7a26\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-dtxr6" Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.482505 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cbwr\" (UniqueName: \"kubernetes.io/projected/b1de57da-fef3-4c24-a501-7f14e9973be9-kube-api-access-9cbwr\") pod \"cert-manager-webhook-5655c58dd6-sf94x\" (UID: \"b1de57da-fef3-4c24-a501-7f14e9973be9\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-sf94x" Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.583064 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-dtxr6" Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.591534 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-xxtv8" Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.621735 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-sf94x" Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.819263 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-xxtv8"] Sep 29 09:58:11 crc kubenswrapper[4891]: I0929 09:58:11.828559 4891 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 09:58:12 crc kubenswrapper[4891]: I0929 09:58:12.088871 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-xxtv8" event={"ID":"10df5ae8-eb89-4efd-8877-6a87a962fbe7","Type":"ContainerStarted","Data":"34ed2f9827726ff30b40a496f7a435a6259727df4f829c6d075eb0b90474f95a"} Sep 29 09:58:12 crc kubenswrapper[4891]: I0929 09:58:12.125878 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-dtxr6"] Sep 29 09:58:12 crc kubenswrapper[4891]: W0929 09:58:12.128846 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32062242_b85f_4c38_a6dd_5701216a7a26.slice/crio-505ab4217603929ea8f7337338049dabe3bb2f968036961e98c658e8da3a65ea WatchSource:0}: Error finding container 505ab4217603929ea8f7337338049dabe3bb2f968036961e98c658e8da3a65ea: Status 404 returned error can't find the container with id 505ab4217603929ea8f7337338049dabe3bb2f968036961e98c658e8da3a65ea Sep 29 09:58:12 crc kubenswrapper[4891]: I0929 09:58:12.129690 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-sf94x"] Sep 29 09:58:12 crc kubenswrapper[4891]: W0929 09:58:12.133106 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1de57da_fef3_4c24_a501_7f14e9973be9.slice/crio-67e8a5a2dcb56df79d1ca72eeb508e39f12e1ad6dc38c90db6e6f3863ca8e861 WatchSource:0}: Error finding container 
67e8a5a2dcb56df79d1ca72eeb508e39f12e1ad6dc38c90db6e6f3863ca8e861: Status 404 returned error can't find the container with id 67e8a5a2dcb56df79d1ca72eeb508e39f12e1ad6dc38c90db6e6f3863ca8e861 Sep 29 09:58:13 crc kubenswrapper[4891]: I0929 09:58:13.095852 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-sf94x" event={"ID":"b1de57da-fef3-4c24-a501-7f14e9973be9","Type":"ContainerStarted","Data":"67e8a5a2dcb56df79d1ca72eeb508e39f12e1ad6dc38c90db6e6f3863ca8e861"} Sep 29 09:58:13 crc kubenswrapper[4891]: I0929 09:58:13.098028 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-dtxr6" event={"ID":"32062242-b85f-4c38-a6dd-5701216a7a26","Type":"ContainerStarted","Data":"505ab4217603929ea8f7337338049dabe3bb2f968036961e98c658e8da3a65ea"} Sep 29 09:58:16 crc kubenswrapper[4891]: I0929 09:58:16.117618 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-sf94x" event={"ID":"b1de57da-fef3-4c24-a501-7f14e9973be9","Type":"ContainerStarted","Data":"8c173ce2a959e1811d60a5c3cadab9b934d7d835d68e195db3696c82feebcfef"} Sep 29 09:58:16 crc kubenswrapper[4891]: I0929 09:58:16.118188 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-sf94x" Sep 29 09:58:16 crc kubenswrapper[4891]: I0929 09:58:16.119204 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-dtxr6" event={"ID":"32062242-b85f-4c38-a6dd-5701216a7a26","Type":"ContainerStarted","Data":"921c4bc5750f920ad235bc9018468d915438ca0eba1a4bd8829a3280bbc1030d"} Sep 29 09:58:16 crc kubenswrapper[4891]: I0929 09:58:16.121252 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-xxtv8" 
event={"ID":"10df5ae8-eb89-4efd-8877-6a87a962fbe7","Type":"ContainerStarted","Data":"1d10bbb1d8a181128b63790c24b87614520049592746c0f16e8a23e9e13a63e9"} Sep 29 09:58:16 crc kubenswrapper[4891]: I0929 09:58:16.135506 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-sf94x" podStartSLOduration=1.573950714 podStartE2EDuration="5.135477767s" podCreationTimestamp="2025-09-29 09:58:11 +0000 UTC" firstStartedPulling="2025-09-29 09:58:12.135277298 +0000 UTC m=+622.340445619" lastFinishedPulling="2025-09-29 09:58:15.696804351 +0000 UTC m=+625.901972672" observedRunningTime="2025-09-29 09:58:16.133508858 +0000 UTC m=+626.338677179" watchObservedRunningTime="2025-09-29 09:58:16.135477767 +0000 UTC m=+626.340646108" Sep 29 09:58:16 crc kubenswrapper[4891]: I0929 09:58:16.166595 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-dtxr6" podStartSLOduration=1.526408671 podStartE2EDuration="5.166573724s" podCreationTimestamp="2025-09-29 09:58:11 +0000 UTC" firstStartedPulling="2025-09-29 09:58:12.131163884 +0000 UTC m=+622.336332195" lastFinishedPulling="2025-09-29 09:58:15.771328927 +0000 UTC m=+625.976497248" observedRunningTime="2025-09-29 09:58:16.153051256 +0000 UTC m=+626.358219577" watchObservedRunningTime="2025-09-29 09:58:16.166573724 +0000 UTC m=+626.371742045" Sep 29 09:58:16 crc kubenswrapper[4891]: I0929 09:58:16.168126 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-xxtv8" podStartSLOduration=1.30035195 podStartE2EDuration="5.16812084s" podCreationTimestamp="2025-09-29 09:58:11 +0000 UTC" firstStartedPulling="2025-09-29 09:58:11.828215347 +0000 UTC m=+622.033383668" lastFinishedPulling="2025-09-29 09:58:15.695984247 +0000 UTC m=+625.901152558" observedRunningTime="2025-09-29 09:58:16.164500851 +0000 UTC m=+626.369669162" watchObservedRunningTime="2025-09-29 
09:58:16.16812084 +0000 UTC m=+626.373289161" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.521238 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fs6qf"] Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.522129 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="ovn-controller" containerID="cri-o://7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad" gracePeriod=30 Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.522165 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="northd" containerID="cri-o://2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535" gracePeriod=30 Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.522206 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="kube-rbac-proxy-node" containerID="cri-o://0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d" gracePeriod=30 Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.522299 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="sbdb" containerID="cri-o://9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f" gracePeriod=30 Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.522370 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="kube-rbac-proxy-ovn-metrics" 
containerID="cri-o://7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2" gracePeriod=30 Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.522367 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="nbdb" containerID="cri-o://9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa" gracePeriod=30 Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.522334 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="ovn-acl-logging" containerID="cri-o://a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598" gracePeriod=30 Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.577449 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="ovnkube-controller" containerID="cri-o://911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543" gracePeriod=30 Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.625341 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-sf94x" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.890749 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fs6qf_01bb1c54-d2f0-498e-ad60-8216c29b843d/ovnkube-controller/3.log" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.894747 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fs6qf_01bb1c54-d2f0-498e-ad60-8216c29b843d/ovn-acl-logging/0.log" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.895330 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fs6qf_01bb1c54-d2f0-498e-ad60-8216c29b843d/ovn-controller/0.log" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.895857 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.950653 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-x7dv2"] Sep 29 09:58:21 crc kubenswrapper[4891]: E0929 09:58:21.950919 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="kube-rbac-proxy-ovn-metrics" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.950943 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="kube-rbac-proxy-ovn-metrics" Sep 29 09:58:21 crc kubenswrapper[4891]: E0929 09:58:21.950958 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="kube-rbac-proxy-node" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.950968 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="kube-rbac-proxy-node" Sep 29 09:58:21 crc kubenswrapper[4891]: E0929 09:58:21.950979 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="northd" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.950986 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="northd" Sep 29 09:58:21 crc kubenswrapper[4891]: E0929 09:58:21.950993 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="ovnkube-controller" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.951000 4891 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="ovnkube-controller" Sep 29 09:58:21 crc kubenswrapper[4891]: E0929 09:58:21.951012 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="ovnkube-controller" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.951020 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="ovnkube-controller" Sep 29 09:58:21 crc kubenswrapper[4891]: E0929 09:58:21.951031 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="ovnkube-controller" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.951038 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="ovnkube-controller" Sep 29 09:58:21 crc kubenswrapper[4891]: E0929 09:58:21.951050 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="ovn-acl-logging" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.951057 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="ovn-acl-logging" Sep 29 09:58:21 crc kubenswrapper[4891]: E0929 09:58:21.951068 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="sbdb" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.951075 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="sbdb" Sep 29 09:58:21 crc kubenswrapper[4891]: E0929 09:58:21.951085 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="kubecfg-setup" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.951091 4891 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="kubecfg-setup" Sep 29 09:58:21 crc kubenswrapper[4891]: E0929 09:58:21.951100 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="ovnkube-controller" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.951107 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="ovnkube-controller" Sep 29 09:58:21 crc kubenswrapper[4891]: E0929 09:58:21.951119 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="ovnkube-controller" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.951127 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="ovnkube-controller" Sep 29 09:58:21 crc kubenswrapper[4891]: E0929 09:58:21.951137 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="nbdb" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.951145 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="nbdb" Sep 29 09:58:21 crc kubenswrapper[4891]: E0929 09:58:21.951155 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="ovn-controller" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.951162 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="ovn-controller" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.951280 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="ovnkube-controller" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.951292 4891 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="kube-rbac-proxy-ovn-metrics" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.951304 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="ovn-controller" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.951315 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="ovnkube-controller" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.951323 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="kube-rbac-proxy-node" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.951558 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="sbdb" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.951571 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="northd" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.951581 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="ovn-acl-logging" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.951594 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="nbdb" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.951602 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="ovnkube-controller" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.951838 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="ovnkube-controller" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.952364 4891 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerName="ovnkube-controller" Sep 29 09:58:21 crc kubenswrapper[4891]: I0929 09:58:21.954601 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.001946 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-systemd-units\") pod \"01bb1c54-d2f0-498e-ad60-8216c29b843d\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.001993 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-run-openvswitch\") pod \"01bb1c54-d2f0-498e-ad60-8216c29b843d\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002013 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-cni-bin\") pod \"01bb1c54-d2f0-498e-ad60-8216c29b843d\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002061 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-slash\") pod \"01bb1c54-d2f0-498e-ad60-8216c29b843d\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002087 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-run-ovn\") pod \"01bb1c54-d2f0-498e-ad60-8216c29b843d\" 
(UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002084 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "01bb1c54-d2f0-498e-ad60-8216c29b843d" (UID: "01bb1c54-d2f0-498e-ad60-8216c29b843d"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002118 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jbx5\" (UniqueName: \"kubernetes.io/projected/01bb1c54-d2f0-498e-ad60-8216c29b843d-kube-api-access-4jbx5\") pod \"01bb1c54-d2f0-498e-ad60-8216c29b843d\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002139 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-kubelet\") pod \"01bb1c54-d2f0-498e-ad60-8216c29b843d\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002164 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-etc-openvswitch\") pod \"01bb1c54-d2f0-498e-ad60-8216c29b843d\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002135 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "01bb1c54-d2f0-498e-ad60-8216c29b843d" (UID: "01bb1c54-d2f0-498e-ad60-8216c29b843d"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002150 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "01bb1c54-d2f0-498e-ad60-8216c29b843d" (UID: "01bb1c54-d2f0-498e-ad60-8216c29b843d"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002147 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "01bb1c54-d2f0-498e-ad60-8216c29b843d" (UID: "01bb1c54-d2f0-498e-ad60-8216c29b843d"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002182 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-node-log\") pod \"01bb1c54-d2f0-498e-ad60-8216c29b843d\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002242 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-node-log" (OuterVolumeSpecName: "node-log") pod "01bb1c54-d2f0-498e-ad60-8216c29b843d" (UID: "01bb1c54-d2f0-498e-ad60-8216c29b843d"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002308 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-var-lib-openvswitch\") pod \"01bb1c54-d2f0-498e-ad60-8216c29b843d\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002254 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "01bb1c54-d2f0-498e-ad60-8216c29b843d" (UID: "01bb1c54-d2f0-498e-ad60-8216c29b843d"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002166 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-slash" (OuterVolumeSpecName: "host-slash") pod "01bb1c54-d2f0-498e-ad60-8216c29b843d" (UID: "01bb1c54-d2f0-498e-ad60-8216c29b843d"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002315 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "01bb1c54-d2f0-498e-ad60-8216c29b843d" (UID: "01bb1c54-d2f0-498e-ad60-8216c29b843d"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002375 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-log-socket" (OuterVolumeSpecName: "log-socket") pod "01bb1c54-d2f0-498e-ad60-8216c29b843d" (UID: "01bb1c54-d2f0-498e-ad60-8216c29b843d"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002376 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "01bb1c54-d2f0-498e-ad60-8216c29b843d" (UID: "01bb1c54-d2f0-498e-ad60-8216c29b843d"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002342 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-log-socket\") pod \"01bb1c54-d2f0-498e-ad60-8216c29b843d\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002437 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/01bb1c54-d2f0-498e-ad60-8216c29b843d-env-overrides\") pod \"01bb1c54-d2f0-498e-ad60-8216c29b843d\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002466 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/01bb1c54-d2f0-498e-ad60-8216c29b843d-ovnkube-config\") pod \"01bb1c54-d2f0-498e-ad60-8216c29b843d\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " Sep 
29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002493 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"01bb1c54-d2f0-498e-ad60-8216c29b843d\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002543 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-run-systemd\") pod \"01bb1c54-d2f0-498e-ad60-8216c29b843d\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002564 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/01bb1c54-d2f0-498e-ad60-8216c29b843d-ovnkube-script-lib\") pod \"01bb1c54-d2f0-498e-ad60-8216c29b843d\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002590 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-cni-netd\") pod \"01bb1c54-d2f0-498e-ad60-8216c29b843d\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002613 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-run-ovn-kubernetes\") pod \"01bb1c54-d2f0-498e-ad60-8216c29b843d\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002644 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-run-netns\") pod \"01bb1c54-d2f0-498e-ad60-8216c29b843d\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002670 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "01bb1c54-d2f0-498e-ad60-8216c29b843d" (UID: "01bb1c54-d2f0-498e-ad60-8216c29b843d"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002678 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/01bb1c54-d2f0-498e-ad60-8216c29b843d-ovn-node-metrics-cert\") pod \"01bb1c54-d2f0-498e-ad60-8216c29b843d\" (UID: \"01bb1c54-d2f0-498e-ad60-8216c29b843d\") " Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002588 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "01bb1c54-d2f0-498e-ad60-8216c29b843d" (UID: "01bb1c54-d2f0-498e-ad60-8216c29b843d"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002944 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-host-kubelet\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002969 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-run-systemd\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002991 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-run-openvswitch\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.003013 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-host-cni-bin\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.003050 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-host-run-netns\") pod \"ovnkube-node-x7dv2\" (UID: 
\"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.002759 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "01bb1c54-d2f0-498e-ad60-8216c29b843d" (UID: "01bb1c54-d2f0-498e-ad60-8216c29b843d"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.003086 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01bb1c54-d2f0-498e-ad60-8216c29b843d-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "01bb1c54-d2f0-498e-ad60-8216c29b843d" (UID: "01bb1c54-d2f0-498e-ad60-8216c29b843d"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.003061 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "01bb1c54-d2f0-498e-ad60-8216c29b843d" (UID: "01bb1c54-d2f0-498e-ad60-8216c29b843d"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.003114 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01bb1c54-d2f0-498e-ad60-8216c29b843d-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "01bb1c54-d2f0-498e-ad60-8216c29b843d" (UID: "01bb1c54-d2f0-498e-ad60-8216c29b843d"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.003283 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01bb1c54-d2f0-498e-ad60-8216c29b843d-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "01bb1c54-d2f0-498e-ad60-8216c29b843d" (UID: "01bb1c54-d2f0-498e-ad60-8216c29b843d"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.003080 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/61de0b8d-5a50-451a-a0bf-16643a3e6288-ovnkube-script-lib\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.003425 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkbtt\" (UniqueName: \"kubernetes.io/projected/61de0b8d-5a50-451a-a0bf-16643a3e6288-kube-api-access-fkbtt\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.003499 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-host-slash\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.003525 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-host-run-ovn-kubernetes\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.003645 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-run-ovn\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.003776 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/61de0b8d-5a50-451a-a0bf-16643a3e6288-ovnkube-config\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.003883 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-node-log\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.003989 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/61de0b8d-5a50-451a-a0bf-16643a3e6288-env-overrides\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.004072 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.004215 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-var-lib-openvswitch\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.004268 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-etc-openvswitch\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.004465 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-log-socket\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.004546 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-systemd-units\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.004605 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-host-cni-netd\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.004628 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/61de0b8d-5a50-451a-a0bf-16643a3e6288-ovn-node-metrics-cert\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.004705 4891 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-log-socket\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.004722 4891 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/01bb1c54-d2f0-498e-ad60-8216c29b843d-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.004733 4891 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/01bb1c54-d2f0-498e-ad60-8216c29b843d-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.004743 4891 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.004754 4891 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/01bb1c54-d2f0-498e-ad60-8216c29b843d-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.004766 4891 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-cni-netd\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.004778 4891 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.004804 4891 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-run-netns\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.004814 4891 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-systemd-units\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.004823 4891 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-run-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.004833 4891 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-cni-bin\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.004844 4891 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-slash\") on node \"crc\" DevicePath \"\"" 
Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.004856 4891 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.004869 4891 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-host-kubelet\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.004878 4891 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.004887 4891 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-node-log\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.004896 4891 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.009006 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01bb1c54-d2f0-498e-ad60-8216c29b843d-kube-api-access-4jbx5" (OuterVolumeSpecName: "kube-api-access-4jbx5") pod "01bb1c54-d2f0-498e-ad60-8216c29b843d" (UID: "01bb1c54-d2f0-498e-ad60-8216c29b843d"). InnerVolumeSpecName "kube-api-access-4jbx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.009506 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01bb1c54-d2f0-498e-ad60-8216c29b843d-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "01bb1c54-d2f0-498e-ad60-8216c29b843d" (UID: "01bb1c54-d2f0-498e-ad60-8216c29b843d"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.017678 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "01bb1c54-d2f0-498e-ad60-8216c29b843d" (UID: "01bb1c54-d2f0-498e-ad60-8216c29b843d"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.105832 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-host-run-netns\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.105883 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/61de0b8d-5a50-451a-a0bf-16643a3e6288-ovnkube-script-lib\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.105905 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkbtt\" (UniqueName: \"kubernetes.io/projected/61de0b8d-5a50-451a-a0bf-16643a3e6288-kube-api-access-fkbtt\") pod 
\"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.105937 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-host-slash\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.105954 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-host-run-ovn-kubernetes\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.105969 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-run-ovn\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.105985 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/61de0b8d-5a50-451a-a0bf-16643a3e6288-ovnkube-config\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.106006 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-node-log\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.106027 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/61de0b8d-5a50-451a-a0bf-16643a3e6288-env-overrides\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.106047 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.106073 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-var-lib-openvswitch\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.106094 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-etc-openvswitch\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.106115 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-log-socket\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.106135 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-systemd-units\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.106158 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-host-cni-netd\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.106172 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/61de0b8d-5a50-451a-a0bf-16643a3e6288-ovn-node-metrics-cert\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.106185 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-host-kubelet\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.106202 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-run-systemd\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc 
kubenswrapper[4891]: I0929 09:58:22.106217 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-run-openvswitch\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.106231 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-host-cni-bin\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.106267 4891 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/01bb1c54-d2f0-498e-ad60-8216c29b843d-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.106280 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jbx5\" (UniqueName: \"kubernetes.io/projected/01bb1c54-d2f0-498e-ad60-8216c29b843d-kube-api-access-4jbx5\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.106290 4891 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/01bb1c54-d2f0-498e-ad60-8216c29b843d-run-systemd\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.106361 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-host-cni-bin\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 
09:58:22.106400 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-host-run-netns\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.107246 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/61de0b8d-5a50-451a-a0bf-16643a3e6288-ovnkube-script-lib\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.107314 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-var-lib-openvswitch\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.107351 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-host-slash\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.107381 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-etc-openvswitch\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.107425 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-host-run-ovn-kubernetes\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.107463 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-run-ovn\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.107476 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-host-kubelet\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.107552 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-run-systemd\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.107608 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-run-openvswitch\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.107928 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-log-socket\") pod 
\"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.108007 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-systemd-units\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.108036 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-host-cni-netd\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.108062 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-node-log\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.108087 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61de0b8d-5a50-451a-a0bf-16643a3e6288-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.108267 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/61de0b8d-5a50-451a-a0bf-16643a3e6288-env-overrides\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.109090 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/61de0b8d-5a50-451a-a0bf-16643a3e6288-ovnkube-config\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.110770 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/61de0b8d-5a50-451a-a0bf-16643a3e6288-ovn-node-metrics-cert\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.124221 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkbtt\" (UniqueName: \"kubernetes.io/projected/61de0b8d-5a50-451a-a0bf-16643a3e6288-kube-api-access-fkbtt\") pod \"ovnkube-node-x7dv2\" (UID: \"61de0b8d-5a50-451a-a0bf-16643a3e6288\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.167469 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fs6qf_01bb1c54-d2f0-498e-ad60-8216c29b843d/ovnkube-controller/3.log" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.170200 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fs6qf_01bb1c54-d2f0-498e-ad60-8216c29b843d/ovn-acl-logging/0.log" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171063 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fs6qf_01bb1c54-d2f0-498e-ad60-8216c29b843d/ovn-controller/0.log" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171605 4891 generic.go:334] "Generic (PLEG): 
container finished" podID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerID="911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543" exitCode=0 Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171644 4891 generic.go:334] "Generic (PLEG): container finished" podID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerID="9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f" exitCode=0 Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171658 4891 generic.go:334] "Generic (PLEG): container finished" podID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerID="9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa" exitCode=0 Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171670 4891 generic.go:334] "Generic (PLEG): container finished" podID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerID="2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535" exitCode=0 Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171681 4891 generic.go:334] "Generic (PLEG): container finished" podID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerID="7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2" exitCode=0 Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171693 4891 generic.go:334] "Generic (PLEG): container finished" podID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerID="0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d" exitCode=0 Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171699 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171705 4891 generic.go:334] "Generic (PLEG): container finished" podID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerID="a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598" exitCode=143 Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171718 4891 generic.go:334] "Generic (PLEG): container finished" podID="01bb1c54-d2f0-498e-ad60-8216c29b843d" containerID="7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad" exitCode=143 Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171708 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerDied","Data":"911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171856 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerDied","Data":"9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171874 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerDied","Data":"9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171886 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerDied","Data":"2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171897 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerDied","Data":"7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171909 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerDied","Data":"0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171927 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171942 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171948 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171954 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171960 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171966 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171972 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171978 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171984 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171991 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerDied","Data":"a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172002 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172010 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172015 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172021 4891 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172026 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172032 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172037 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172043 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172051 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172057 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172064 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerDied","Data":"7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad"} Sep 29 
09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172075 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172082 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172088 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172094 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172100 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172107 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172113 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172118 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598"} Sep 29 
09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172124 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172129 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.171967 4891 scope.go:117] "RemoveContainer" containerID="911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172137 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fs6qf" event={"ID":"01bb1c54-d2f0-498e-ad60-8216c29b843d","Type":"ContainerDied","Data":"5a3c4615a682515d0b2ce56502f35bd34b59275f7787593663d716d848591f3a"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172285 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172308 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172315 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172320 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa"} Sep 29 09:58:22 
crc kubenswrapper[4891]: I0929 09:58:22.172325 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172331 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172336 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172341 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172346 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.172351 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.175986 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ngmm4_4bfce090-366c-43be-ab12-d291b4d25217/kube-multus/2.log" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.176578 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ngmm4_4bfce090-366c-43be-ab12-d291b4d25217/kube-multus/1.log" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.176617 4891 generic.go:334] "Generic 
(PLEG): container finished" podID="4bfce090-366c-43be-ab12-d291b4d25217" containerID="bdecff15ea67cb3d37c875fefda1df1046957856884f5759f93b13e4612a9bf3" exitCode=2 Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.176645 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ngmm4" event={"ID":"4bfce090-366c-43be-ab12-d291b4d25217","Type":"ContainerDied","Data":"bdecff15ea67cb3d37c875fefda1df1046957856884f5759f93b13e4612a9bf3"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.176664 4891 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d37de03666ab8602a2a9d90c21788caee65748a0b3bdb4a81569c5bd05458aa8"} Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.177176 4891 scope.go:117] "RemoveContainer" containerID="bdecff15ea67cb3d37c875fefda1df1046957856884f5759f93b13e4612a9bf3" Sep 29 09:58:22 crc kubenswrapper[4891]: E0929 09:58:22.177464 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-ngmm4_openshift-multus(4bfce090-366c-43be-ab12-d291b4d25217)\"" pod="openshift-multus/multus-ngmm4" podUID="4bfce090-366c-43be-ab12-d291b4d25217" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.213374 4891 scope.go:117] "RemoveContainer" containerID="7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.229676 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fs6qf"] Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.233749 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fs6qf"] Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.247616 4891 scope.go:117] "RemoveContainer" containerID="9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f" 
Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.264211 4891 scope.go:117] "RemoveContainer" containerID="9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.270103 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.300112 4891 scope.go:117] "RemoveContainer" containerID="2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.320751 4891 scope.go:117] "RemoveContainer" containerID="7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.340047 4891 scope.go:117] "RemoveContainer" containerID="0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.354758 4891 scope.go:117] "RemoveContainer" containerID="a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.377866 4891 scope.go:117] "RemoveContainer" containerID="7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.394922 4891 scope.go:117] "RemoveContainer" containerID="a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.403764 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01bb1c54-d2f0-498e-ad60-8216c29b843d" path="/var/lib/kubelet/pods/01bb1c54-d2f0-498e-ad60-8216c29b843d/volumes" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.413351 4891 scope.go:117] "RemoveContainer" containerID="911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543" Sep 29 09:58:22 crc kubenswrapper[4891]: E0929 09:58:22.414214 4891 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543\": container with ID starting with 911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543 not found: ID does not exist" containerID="911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.414274 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543"} err="failed to get container status \"911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543\": rpc error: code = NotFound desc = could not find container \"911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543\": container with ID starting with 911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543 not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.414315 4891 scope.go:117] "RemoveContainer" containerID="7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef" Sep 29 09:58:22 crc kubenswrapper[4891]: E0929 09:58:22.415085 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef\": container with ID starting with 7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef not found: ID does not exist" containerID="7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.415127 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef"} err="failed to get container status \"7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef\": rpc error: code = NotFound desc = could not find container 
\"7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef\": container with ID starting with 7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.415162 4891 scope.go:117] "RemoveContainer" containerID="9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f" Sep 29 09:58:22 crc kubenswrapper[4891]: E0929 09:58:22.415682 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\": container with ID starting with 9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f not found: ID does not exist" containerID="9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.415751 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f"} err="failed to get container status \"9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\": rpc error: code = NotFound desc = could not find container \"9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\": container with ID starting with 9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.415810 4891 scope.go:117] "RemoveContainer" containerID="9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa" Sep 29 09:58:22 crc kubenswrapper[4891]: E0929 09:58:22.416253 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\": container with ID starting with 9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa not found: ID does not exist" 
containerID="9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.416280 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa"} err="failed to get container status \"9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\": rpc error: code = NotFound desc = could not find container \"9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\": container with ID starting with 9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.416311 4891 scope.go:117] "RemoveContainer" containerID="2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535" Sep 29 09:58:22 crc kubenswrapper[4891]: E0929 09:58:22.416652 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\": container with ID starting with 2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535 not found: ID does not exist" containerID="2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.416689 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535"} err="failed to get container status \"2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\": rpc error: code = NotFound desc = could not find container \"2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\": container with ID starting with 2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535 not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.416705 4891 scope.go:117] 
"RemoveContainer" containerID="7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2" Sep 29 09:58:22 crc kubenswrapper[4891]: E0929 09:58:22.417003 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\": container with ID starting with 7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2 not found: ID does not exist" containerID="7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.417041 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2"} err="failed to get container status \"7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\": rpc error: code = NotFound desc = could not find container \"7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\": container with ID starting with 7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2 not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.417059 4891 scope.go:117] "RemoveContainer" containerID="0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d" Sep 29 09:58:22 crc kubenswrapper[4891]: E0929 09:58:22.417367 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\": container with ID starting with 0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d not found: ID does not exist" containerID="0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.417396 4891 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d"} err="failed to get container status \"0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\": rpc error: code = NotFound desc = could not find container \"0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\": container with ID starting with 0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.417415 4891 scope.go:117] "RemoveContainer" containerID="a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598" Sep 29 09:58:22 crc kubenswrapper[4891]: E0929 09:58:22.417837 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\": container with ID starting with a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598 not found: ID does not exist" containerID="a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.417864 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598"} err="failed to get container status \"a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\": rpc error: code = NotFound desc = could not find container \"a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\": container with ID starting with a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598 not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.417879 4891 scope.go:117] "RemoveContainer" containerID="7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad" Sep 29 09:58:22 crc kubenswrapper[4891]: E0929 09:58:22.418165 4891 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\": container with ID starting with 7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad not found: ID does not exist" containerID="7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.418203 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad"} err="failed to get container status \"7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\": rpc error: code = NotFound desc = could not find container \"7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\": container with ID starting with 7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.418229 4891 scope.go:117] "RemoveContainer" containerID="a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a" Sep 29 09:58:22 crc kubenswrapper[4891]: E0929 09:58:22.418774 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\": container with ID starting with a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a not found: ID does not exist" containerID="a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.418847 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a"} err="failed to get container status \"a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\": rpc error: code = NotFound desc = could not find container 
\"a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\": container with ID starting with a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.418868 4891 scope.go:117] "RemoveContainer" containerID="911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.419178 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543"} err="failed to get container status \"911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543\": rpc error: code = NotFound desc = could not find container \"911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543\": container with ID starting with 911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543 not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.419202 4891 scope.go:117] "RemoveContainer" containerID="7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.419480 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef"} err="failed to get container status \"7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef\": rpc error: code = NotFound desc = could not find container \"7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef\": container with ID starting with 7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.419504 4891 scope.go:117] "RemoveContainer" containerID="9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.420014 4891 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f"} err="failed to get container status \"9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\": rpc error: code = NotFound desc = could not find container \"9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\": container with ID starting with 9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.420044 4891 scope.go:117] "RemoveContainer" containerID="9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.420318 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa"} err="failed to get container status \"9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\": rpc error: code = NotFound desc = could not find container \"9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\": container with ID starting with 9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.420346 4891 scope.go:117] "RemoveContainer" containerID="2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.420548 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535"} err="failed to get container status \"2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\": rpc error: code = NotFound desc = could not find container \"2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\": container with ID starting with 
2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535 not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.420577 4891 scope.go:117] "RemoveContainer" containerID="7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.421700 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2"} err="failed to get container status \"7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\": rpc error: code = NotFound desc = could not find container \"7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\": container with ID starting with 7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2 not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.421728 4891 scope.go:117] "RemoveContainer" containerID="0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.422087 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d"} err="failed to get container status \"0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\": rpc error: code = NotFound desc = could not find container \"0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\": container with ID starting with 0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.422111 4891 scope.go:117] "RemoveContainer" containerID="a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.422446 4891 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598"} err="failed to get container status \"a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\": rpc error: code = NotFound desc = could not find container \"a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\": container with ID starting with a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598 not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.422463 4891 scope.go:117] "RemoveContainer" containerID="7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.422681 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad"} err="failed to get container status \"7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\": rpc error: code = NotFound desc = could not find container \"7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\": container with ID starting with 7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.422706 4891 scope.go:117] "RemoveContainer" containerID="a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.423005 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a"} err="failed to get container status \"a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\": rpc error: code = NotFound desc = could not find container \"a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\": container with ID starting with a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a not found: ID does not 
exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.423021 4891 scope.go:117] "RemoveContainer" containerID="911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.423288 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543"} err="failed to get container status \"911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543\": rpc error: code = NotFound desc = could not find container \"911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543\": container with ID starting with 911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543 not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.423307 4891 scope.go:117] "RemoveContainer" containerID="7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.423509 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef"} err="failed to get container status \"7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef\": rpc error: code = NotFound desc = could not find container \"7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef\": container with ID starting with 7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.423527 4891 scope.go:117] "RemoveContainer" containerID="9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.424082 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f"} err="failed to get container status 
\"9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\": rpc error: code = NotFound desc = could not find container \"9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\": container with ID starting with 9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.424123 4891 scope.go:117] "RemoveContainer" containerID="9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.424613 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa"} err="failed to get container status \"9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\": rpc error: code = NotFound desc = could not find container \"9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\": container with ID starting with 9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.424639 4891 scope.go:117] "RemoveContainer" containerID="2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.425103 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535"} err="failed to get container status \"2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\": rpc error: code = NotFound desc = could not find container \"2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\": container with ID starting with 2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535 not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.425130 4891 scope.go:117] "RemoveContainer" 
containerID="7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.425590 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2"} err="failed to get container status \"7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\": rpc error: code = NotFound desc = could not find container \"7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\": container with ID starting with 7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2 not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.425620 4891 scope.go:117] "RemoveContainer" containerID="0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.425993 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d"} err="failed to get container status \"0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\": rpc error: code = NotFound desc = could not find container \"0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\": container with ID starting with 0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.426021 4891 scope.go:117] "RemoveContainer" containerID="a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.426347 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598"} err="failed to get container status \"a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\": rpc error: code = NotFound desc = could 
not find container \"a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\": container with ID starting with a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598 not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.426373 4891 scope.go:117] "RemoveContainer" containerID="7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.426690 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad"} err="failed to get container status \"7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\": rpc error: code = NotFound desc = could not find container \"7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\": container with ID starting with 7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.426712 4891 scope.go:117] "RemoveContainer" containerID="a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.427197 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a"} err="failed to get container status \"a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\": rpc error: code = NotFound desc = could not find container \"a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\": container with ID starting with a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.427230 4891 scope.go:117] "RemoveContainer" containerID="911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 
09:58:22.428014 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543"} err="failed to get container status \"911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543\": rpc error: code = NotFound desc = could not find container \"911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543\": container with ID starting with 911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543 not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.428046 4891 scope.go:117] "RemoveContainer" containerID="7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.428400 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef"} err="failed to get container status \"7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef\": rpc error: code = NotFound desc = could not find container \"7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef\": container with ID starting with 7135a331f3c074b06b956c8540eff078f757e8858be2e125284135b72175e5ef not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.428425 4891 scope.go:117] "RemoveContainer" containerID="9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.428741 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f"} err="failed to get container status \"9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\": rpc error: code = NotFound desc = could not find container \"9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f\": container with ID starting with 
9c8fe3f555b3b98351418a4a268007ac64efab96add6ff35502320983c034e6f not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.428764 4891 scope.go:117] "RemoveContainer" containerID="9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.429084 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa"} err="failed to get container status \"9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\": rpc error: code = NotFound desc = could not find container \"9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa\": container with ID starting with 9b0efbad9d7dd7f5cbe7b3431f28a1cfdf2d25460f3702899d4c25e496c0aaaa not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.429219 4891 scope.go:117] "RemoveContainer" containerID="2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.429566 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535"} err="failed to get container status \"2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\": rpc error: code = NotFound desc = could not find container \"2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535\": container with ID starting with 2ca71516d83f9722cedb89b2aa631529b6078372faf87ec4a5cbf101651f9535 not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.429595 4891 scope.go:117] "RemoveContainer" containerID="7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.430050 4891 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2"} err="failed to get container status \"7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\": rpc error: code = NotFound desc = could not find container \"7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2\": container with ID starting with 7afa75b63501245b7f3aa2c79be7504225a44c1da08b1d641eb40e9bbbfc39e2 not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.430071 4891 scope.go:117] "RemoveContainer" containerID="0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.430479 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d"} err="failed to get container status \"0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\": rpc error: code = NotFound desc = could not find container \"0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d\": container with ID starting with 0a928a8b5e5769df675c27bd9cb78bd0d8eb4c459a2287ccce8e6c1eb066ea8d not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.430509 4891 scope.go:117] "RemoveContainer" containerID="a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.431045 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598"} err="failed to get container status \"a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\": rpc error: code = NotFound desc = could not find container \"a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598\": container with ID starting with a9864d612ad928200ddb1e903245f9bb75d0983f5b48919dede1f592c6a3c598 not found: ID does not 
exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.431075 4891 scope.go:117] "RemoveContainer" containerID="7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.431432 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad"} err="failed to get container status \"7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\": rpc error: code = NotFound desc = could not find container \"7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad\": container with ID starting with 7e5bd9f72ebaa02d52e906ba8311809fee8e64b435a116d92e374d10e045b5ad not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.431456 4891 scope.go:117] "RemoveContainer" containerID="a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.431850 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a"} err="failed to get container status \"a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\": rpc error: code = NotFound desc = could not find container \"a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a\": container with ID starting with a60b2fb13d3bdf2fce7defbbdb85292f4c7e1da6a82a0e3fc84618969c48cc2a not found: ID does not exist" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.431951 4891 scope.go:117] "RemoveContainer" containerID="911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543" Sep 29 09:58:22 crc kubenswrapper[4891]: I0929 09:58:22.432293 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543"} err="failed to get container status 
\"911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543\": rpc error: code = NotFound desc = could not find container \"911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543\": container with ID starting with 911e340b6e3d65eef5cf08198c52d78530a2ee2ce9d7f1f35d4202d112350543 not found: ID does not exist" Sep 29 09:58:23 crc kubenswrapper[4891]: I0929 09:58:23.191363 4891 generic.go:334] "Generic (PLEG): container finished" podID="61de0b8d-5a50-451a-a0bf-16643a3e6288" containerID="56482ce4e95d15b4ca6e70c00c73129f0dca01142dadd53b266a197bfcee2fe1" exitCode=0 Sep 29 09:58:23 crc kubenswrapper[4891]: I0929 09:58:23.191428 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" event={"ID":"61de0b8d-5a50-451a-a0bf-16643a3e6288","Type":"ContainerDied","Data":"56482ce4e95d15b4ca6e70c00c73129f0dca01142dadd53b266a197bfcee2fe1"} Sep 29 09:58:23 crc kubenswrapper[4891]: I0929 09:58:23.191475 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" event={"ID":"61de0b8d-5a50-451a-a0bf-16643a3e6288","Type":"ContainerStarted","Data":"100ea57b8f39188f18f3584f7b9647ac6e0093527098404a889ad120261619df"} Sep 29 09:58:24 crc kubenswrapper[4891]: I0929 09:58:24.202650 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" event={"ID":"61de0b8d-5a50-451a-a0bf-16643a3e6288","Type":"ContainerStarted","Data":"ac8fd59c07d49abdea1cfecce1832786c42a5e3dfe538daa8d86eec67f8c9a19"} Sep 29 09:58:24 crc kubenswrapper[4891]: I0929 09:58:24.203265 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" event={"ID":"61de0b8d-5a50-451a-a0bf-16643a3e6288","Type":"ContainerStarted","Data":"bc98257759865b05c6ce907bed2dad76b16fd336368c4f9ebbcc4d8cbd6118bc"} Sep 29 09:58:24 crc kubenswrapper[4891]: I0929 09:58:24.203279 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" event={"ID":"61de0b8d-5a50-451a-a0bf-16643a3e6288","Type":"ContainerStarted","Data":"ee68cf0913cd3f08d03bc21e9798bbccb8514aedcc07e67a9675fd21edd69c33"} Sep 29 09:58:24 crc kubenswrapper[4891]: I0929 09:58:24.203287 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" event={"ID":"61de0b8d-5a50-451a-a0bf-16643a3e6288","Type":"ContainerStarted","Data":"4452b851aaa1f6db0ce8be8c2375e9edb8c1a67499fbb73c9fc86888eb746bff"} Sep 29 09:58:24 crc kubenswrapper[4891]: I0929 09:58:24.203299 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" event={"ID":"61de0b8d-5a50-451a-a0bf-16643a3e6288","Type":"ContainerStarted","Data":"487d9065456e5b6cc4f3b9305104c895244133dbcc13d0ab075dd16ad1edabb4"} Sep 29 09:58:24 crc kubenswrapper[4891]: I0929 09:58:24.203310 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" event={"ID":"61de0b8d-5a50-451a-a0bf-16643a3e6288","Type":"ContainerStarted","Data":"0e3d6b815a90b21b2a401fbe31a11fb908e50dda7b28d938edebc9207b7ffe8a"} Sep 29 09:58:27 crc kubenswrapper[4891]: I0929 09:58:27.224922 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" event={"ID":"61de0b8d-5a50-451a-a0bf-16643a3e6288","Type":"ContainerStarted","Data":"b7947d4de6722e6b3a1945bf5cca77ada979b74460dc850b816fa00032977643"} Sep 29 09:58:27 crc kubenswrapper[4891]: I0929 09:58:27.225356 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" event={"ID":"61de0b8d-5a50-451a-a0bf-16643a3e6288","Type":"ContainerStarted","Data":"a6f77a49c9a5305b0389bac5e8b9d7f05b91f8a00146255174cfc20df4b4aff1"} Sep 29 09:58:27 crc kubenswrapper[4891]: I0929 09:58:27.225371 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:27 
crc kubenswrapper[4891]: I0929 09:58:27.251004 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" podStartSLOduration=6.250978978 podStartE2EDuration="6.250978978s" podCreationTimestamp="2025-09-29 09:58:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:58:27.249488303 +0000 UTC m=+637.454656634" watchObservedRunningTime="2025-09-29 09:58:27.250978978 +0000 UTC m=+637.456147319" Sep 29 09:58:27 crc kubenswrapper[4891]: I0929 09:58:27.256606 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:28 crc kubenswrapper[4891]: I0929 09:58:28.231069 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:28 crc kubenswrapper[4891]: I0929 09:58:28.231347 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:28 crc kubenswrapper[4891]: I0929 09:58:28.298034 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:58:36 crc kubenswrapper[4891]: I0929 09:58:36.186614 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 09:58:36 crc kubenswrapper[4891]: I0929 09:58:36.187652 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Sep 29 09:58:36 crc kubenswrapper[4891]: I0929 09:58:36.187729 4891 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 09:58:36 crc kubenswrapper[4891]: I0929 09:58:36.188668 4891 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8378d58b094c2cac919b4d4b3b96c7247b1168ddd946002225b707d6b5dec558"} pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 09:58:36 crc kubenswrapper[4891]: I0929 09:58:36.188743 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" containerID="cri-o://8378d58b094c2cac919b4d4b3b96c7247b1168ddd946002225b707d6b5dec558" gracePeriod=600 Sep 29 09:58:36 crc kubenswrapper[4891]: I0929 09:58:36.397184 4891 scope.go:117] "RemoveContainer" containerID="bdecff15ea67cb3d37c875fefda1df1046957856884f5759f93b13e4612a9bf3" Sep 29 09:58:36 crc kubenswrapper[4891]: E0929 09:58:36.397539 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-ngmm4_openshift-multus(4bfce090-366c-43be-ab12-d291b4d25217)\"" pod="openshift-multus/multus-ngmm4" podUID="4bfce090-366c-43be-ab12-d291b4d25217" Sep 29 09:58:37 crc kubenswrapper[4891]: I0929 09:58:37.288003 4891 generic.go:334] "Generic (PLEG): container finished" podID="582de198-5a15-4c4c-aaea-881c638a42ac" containerID="8378d58b094c2cac919b4d4b3b96c7247b1168ddd946002225b707d6b5dec558" exitCode=0 Sep 29 09:58:37 crc kubenswrapper[4891]: I0929 09:58:37.288086 4891 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerDied","Data":"8378d58b094c2cac919b4d4b3b96c7247b1168ddd946002225b707d6b5dec558"} Sep 29 09:58:37 crc kubenswrapper[4891]: I0929 09:58:37.288384 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerStarted","Data":"c84275b86dc57120809c449b950b94bd4d98c74f39bd83b524ef83f4962bef20"} Sep 29 09:58:37 crc kubenswrapper[4891]: I0929 09:58:37.288412 4891 scope.go:117] "RemoveContainer" containerID="ad896143667aee79e9b59c715f3d34dab8dd50a3b2883d46a38afda965f786f6" Sep 29 09:58:49 crc kubenswrapper[4891]: I0929 09:58:49.396024 4891 scope.go:117] "RemoveContainer" containerID="bdecff15ea67cb3d37c875fefda1df1046957856884f5759f93b13e4612a9bf3" Sep 29 09:58:50 crc kubenswrapper[4891]: I0929 09:58:50.380411 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ngmm4_4bfce090-366c-43be-ab12-d291b4d25217/kube-multus/2.log" Sep 29 09:58:50 crc kubenswrapper[4891]: I0929 09:58:50.382372 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ngmm4_4bfce090-366c-43be-ab12-d291b4d25217/kube-multus/1.log" Sep 29 09:58:50 crc kubenswrapper[4891]: I0929 09:58:50.382483 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ngmm4" event={"ID":"4bfce090-366c-43be-ab12-d291b4d25217","Type":"ContainerStarted","Data":"10e429d0de6cc97a06d610ad5b7d13103958f07be80387abecfbd7507dd1fb52"} Sep 29 09:58:50 crc kubenswrapper[4891]: I0929 09:58:50.666207 4891 scope.go:117] "RemoveContainer" containerID="d37de03666ab8602a2a9d90c21788caee65748a0b3bdb4a81569c5bd05458aa8" Sep 29 09:58:51 crc kubenswrapper[4891]: I0929 09:58:51.390411 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-ngmm4_4bfce090-366c-43be-ab12-d291b4d25217/kube-multus/2.log" Sep 29 09:58:52 crc kubenswrapper[4891]: I0929 09:58:52.289469 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x7dv2" Sep 29 09:59:01 crc kubenswrapper[4891]: I0929 09:59:01.108135 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8"] Sep 29 09:59:01 crc kubenswrapper[4891]: I0929 09:59:01.110269 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8" Sep 29 09:59:01 crc kubenswrapper[4891]: I0929 09:59:01.112578 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 29 09:59:01 crc kubenswrapper[4891]: I0929 09:59:01.122035 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8"] Sep 29 09:59:01 crc kubenswrapper[4891]: I0929 09:59:01.148417 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q7bd\" (UniqueName: \"kubernetes.io/projected/011f8a2f-1062-4b33-8244-235930966cf1-kube-api-access-5q7bd\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8\" (UID: \"011f8a2f-1062-4b33-8244-235930966cf1\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8" Sep 29 09:59:01 crc kubenswrapper[4891]: I0929 09:59:01.148516 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/011f8a2f-1062-4b33-8244-235930966cf1-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8\" (UID: 
\"011f8a2f-1062-4b33-8244-235930966cf1\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8" Sep 29 09:59:01 crc kubenswrapper[4891]: I0929 09:59:01.148565 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/011f8a2f-1062-4b33-8244-235930966cf1-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8\" (UID: \"011f8a2f-1062-4b33-8244-235930966cf1\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8" Sep 29 09:59:01 crc kubenswrapper[4891]: I0929 09:59:01.250195 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/011f8a2f-1062-4b33-8244-235930966cf1-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8\" (UID: \"011f8a2f-1062-4b33-8244-235930966cf1\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8" Sep 29 09:59:01 crc kubenswrapper[4891]: I0929 09:59:01.250263 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/011f8a2f-1062-4b33-8244-235930966cf1-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8\" (UID: \"011f8a2f-1062-4b33-8244-235930966cf1\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8" Sep 29 09:59:01 crc kubenswrapper[4891]: I0929 09:59:01.250306 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q7bd\" (UniqueName: \"kubernetes.io/projected/011f8a2f-1062-4b33-8244-235930966cf1-kube-api-access-5q7bd\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8\" (UID: \"011f8a2f-1062-4b33-8244-235930966cf1\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8" 
Sep 29 09:59:01 crc kubenswrapper[4891]: I0929 09:59:01.251158 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/011f8a2f-1062-4b33-8244-235930966cf1-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8\" (UID: \"011f8a2f-1062-4b33-8244-235930966cf1\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8" Sep 29 09:59:01 crc kubenswrapper[4891]: I0929 09:59:01.251194 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/011f8a2f-1062-4b33-8244-235930966cf1-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8\" (UID: \"011f8a2f-1062-4b33-8244-235930966cf1\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8" Sep 29 09:59:01 crc kubenswrapper[4891]: I0929 09:59:01.271880 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q7bd\" (UniqueName: \"kubernetes.io/projected/011f8a2f-1062-4b33-8244-235930966cf1-kube-api-access-5q7bd\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8\" (UID: \"011f8a2f-1062-4b33-8244-235930966cf1\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8" Sep 29 09:59:01 crc kubenswrapper[4891]: I0929 09:59:01.492248 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8" Sep 29 09:59:01 crc kubenswrapper[4891]: I0929 09:59:01.740837 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8"] Sep 29 09:59:02 crc kubenswrapper[4891]: I0929 09:59:02.462474 4891 generic.go:334] "Generic (PLEG): container finished" podID="011f8a2f-1062-4b33-8244-235930966cf1" containerID="78677396640bba0d9d981643e7d337e09321c9ba9d2194238c19cf5867a219e7" exitCode=0 Sep 29 09:59:02 crc kubenswrapper[4891]: I0929 09:59:02.462687 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8" event={"ID":"011f8a2f-1062-4b33-8244-235930966cf1","Type":"ContainerDied","Data":"78677396640bba0d9d981643e7d337e09321c9ba9d2194238c19cf5867a219e7"} Sep 29 09:59:02 crc kubenswrapper[4891]: I0929 09:59:02.462989 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8" event={"ID":"011f8a2f-1062-4b33-8244-235930966cf1","Type":"ContainerStarted","Data":"6d3606606cd9bf1b8de69bd340b4d677c9828fd75db935d303f092e1e99ee666"} Sep 29 09:59:04 crc kubenswrapper[4891]: I0929 09:59:04.479413 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8" event={"ID":"011f8a2f-1062-4b33-8244-235930966cf1","Type":"ContainerStarted","Data":"678dee4465fce0b50646fb984fc94168848bfb6244d16a07af789b785319d7ae"} Sep 29 09:59:05 crc kubenswrapper[4891]: I0929 09:59:05.490283 4891 generic.go:334] "Generic (PLEG): container finished" podID="011f8a2f-1062-4b33-8244-235930966cf1" containerID="678dee4465fce0b50646fb984fc94168848bfb6244d16a07af789b785319d7ae" exitCode=0 Sep 29 09:59:05 crc kubenswrapper[4891]: I0929 09:59:05.490352 4891 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8" event={"ID":"011f8a2f-1062-4b33-8244-235930966cf1","Type":"ContainerDied","Data":"678dee4465fce0b50646fb984fc94168848bfb6244d16a07af789b785319d7ae"} Sep 29 09:59:06 crc kubenswrapper[4891]: I0929 09:59:06.501067 4891 generic.go:334] "Generic (PLEG): container finished" podID="011f8a2f-1062-4b33-8244-235930966cf1" containerID="c039f670cb26b78fb83147f7ac23e8225fc57d57d9f5bd042f20081f56d07e60" exitCode=0 Sep 29 09:59:06 crc kubenswrapper[4891]: I0929 09:59:06.501145 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8" event={"ID":"011f8a2f-1062-4b33-8244-235930966cf1","Type":"ContainerDied","Data":"c039f670cb26b78fb83147f7ac23e8225fc57d57d9f5bd042f20081f56d07e60"} Sep 29 09:59:07 crc kubenswrapper[4891]: I0929 09:59:07.739715 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8" Sep 29 09:59:07 crc kubenswrapper[4891]: I0929 09:59:07.837654 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/011f8a2f-1062-4b33-8244-235930966cf1-bundle\") pod \"011f8a2f-1062-4b33-8244-235930966cf1\" (UID: \"011f8a2f-1062-4b33-8244-235930966cf1\") " Sep 29 09:59:07 crc kubenswrapper[4891]: I0929 09:59:07.837771 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q7bd\" (UniqueName: \"kubernetes.io/projected/011f8a2f-1062-4b33-8244-235930966cf1-kube-api-access-5q7bd\") pod \"011f8a2f-1062-4b33-8244-235930966cf1\" (UID: \"011f8a2f-1062-4b33-8244-235930966cf1\") " Sep 29 09:59:07 crc kubenswrapper[4891]: I0929 09:59:07.837829 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/011f8a2f-1062-4b33-8244-235930966cf1-util\") pod \"011f8a2f-1062-4b33-8244-235930966cf1\" (UID: \"011f8a2f-1062-4b33-8244-235930966cf1\") " Sep 29 09:59:07 crc kubenswrapper[4891]: I0929 09:59:07.840428 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/011f8a2f-1062-4b33-8244-235930966cf1-bundle" (OuterVolumeSpecName: "bundle") pod "011f8a2f-1062-4b33-8244-235930966cf1" (UID: "011f8a2f-1062-4b33-8244-235930966cf1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:59:07 crc kubenswrapper[4891]: I0929 09:59:07.856482 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/011f8a2f-1062-4b33-8244-235930966cf1-util" (OuterVolumeSpecName: "util") pod "011f8a2f-1062-4b33-8244-235930966cf1" (UID: "011f8a2f-1062-4b33-8244-235930966cf1"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:59:07 crc kubenswrapper[4891]: I0929 09:59:07.864167 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/011f8a2f-1062-4b33-8244-235930966cf1-kube-api-access-5q7bd" (OuterVolumeSpecName: "kube-api-access-5q7bd") pod "011f8a2f-1062-4b33-8244-235930966cf1" (UID: "011f8a2f-1062-4b33-8244-235930966cf1"). InnerVolumeSpecName "kube-api-access-5q7bd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:59:07 crc kubenswrapper[4891]: I0929 09:59:07.938859 4891 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/011f8a2f-1062-4b33-8244-235930966cf1-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 09:59:07 crc kubenswrapper[4891]: I0929 09:59:07.938889 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q7bd\" (UniqueName: \"kubernetes.io/projected/011f8a2f-1062-4b33-8244-235930966cf1-kube-api-access-5q7bd\") on node \"crc\" DevicePath \"\"" Sep 29 09:59:07 crc kubenswrapper[4891]: I0929 09:59:07.938900 4891 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/011f8a2f-1062-4b33-8244-235930966cf1-util\") on node \"crc\" DevicePath \"\"" Sep 29 09:59:08 crc kubenswrapper[4891]: I0929 09:59:08.519601 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8" event={"ID":"011f8a2f-1062-4b33-8244-235930966cf1","Type":"ContainerDied","Data":"6d3606606cd9bf1b8de69bd340b4d677c9828fd75db935d303f092e1e99ee666"} Sep 29 09:59:08 crc kubenswrapper[4891]: I0929 09:59:08.519658 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d3606606cd9bf1b8de69bd340b4d677c9828fd75db935d303f092e1e99ee666" Sep 29 09:59:08 crc kubenswrapper[4891]: I0929 09:59:08.519689 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8" Sep 29 09:59:10 crc kubenswrapper[4891]: I0929 09:59:10.173531 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-qns5g"] Sep 29 09:59:10 crc kubenswrapper[4891]: E0929 09:59:10.174079 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011f8a2f-1062-4b33-8244-235930966cf1" containerName="extract" Sep 29 09:59:10 crc kubenswrapper[4891]: I0929 09:59:10.174097 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="011f8a2f-1062-4b33-8244-235930966cf1" containerName="extract" Sep 29 09:59:10 crc kubenswrapper[4891]: E0929 09:59:10.174118 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011f8a2f-1062-4b33-8244-235930966cf1" containerName="util" Sep 29 09:59:10 crc kubenswrapper[4891]: I0929 09:59:10.174126 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="011f8a2f-1062-4b33-8244-235930966cf1" containerName="util" Sep 29 09:59:10 crc kubenswrapper[4891]: E0929 09:59:10.174136 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011f8a2f-1062-4b33-8244-235930966cf1" containerName="pull" Sep 29 09:59:10 crc kubenswrapper[4891]: I0929 09:59:10.174143 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="011f8a2f-1062-4b33-8244-235930966cf1" containerName="pull" Sep 29 09:59:10 crc kubenswrapper[4891]: I0929 09:59:10.174374 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="011f8a2f-1062-4b33-8244-235930966cf1" containerName="extract" Sep 29 09:59:10 crc kubenswrapper[4891]: I0929 09:59:10.174905 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qns5g" Sep 29 09:59:10 crc kubenswrapper[4891]: I0929 09:59:10.180084 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Sep 29 09:59:10 crc kubenswrapper[4891]: I0929 09:59:10.180077 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-z5mf8" Sep 29 09:59:10 crc kubenswrapper[4891]: I0929 09:59:10.180300 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Sep 29 09:59:10 crc kubenswrapper[4891]: I0929 09:59:10.196053 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-qns5g"] Sep 29 09:59:10 crc kubenswrapper[4891]: I0929 09:59:10.276388 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gskv\" (UniqueName: \"kubernetes.io/projected/08922d5d-a7ec-41c0-9085-bcc17847df78-kube-api-access-8gskv\") pod \"nmstate-operator-5d6f6cfd66-qns5g\" (UID: \"08922d5d-a7ec-41c0-9085-bcc17847df78\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qns5g" Sep 29 09:59:10 crc kubenswrapper[4891]: I0929 09:59:10.377728 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gskv\" (UniqueName: \"kubernetes.io/projected/08922d5d-a7ec-41c0-9085-bcc17847df78-kube-api-access-8gskv\") pod \"nmstate-operator-5d6f6cfd66-qns5g\" (UID: \"08922d5d-a7ec-41c0-9085-bcc17847df78\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qns5g" Sep 29 09:59:10 crc kubenswrapper[4891]: I0929 09:59:10.408399 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gskv\" (UniqueName: \"kubernetes.io/projected/08922d5d-a7ec-41c0-9085-bcc17847df78-kube-api-access-8gskv\") pod \"nmstate-operator-5d6f6cfd66-qns5g\" (UID: 
\"08922d5d-a7ec-41c0-9085-bcc17847df78\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qns5g" Sep 29 09:59:10 crc kubenswrapper[4891]: I0929 09:59:10.491172 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qns5g" Sep 29 09:59:10 crc kubenswrapper[4891]: I0929 09:59:10.703391 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-qns5g"] Sep 29 09:59:11 crc kubenswrapper[4891]: I0929 09:59:11.537295 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qns5g" event={"ID":"08922d5d-a7ec-41c0-9085-bcc17847df78","Type":"ContainerStarted","Data":"d05f820c7a2dabdac57e447b298812f20d099a886b082c15f4f6ea3bfcf13b2e"} Sep 29 09:59:14 crc kubenswrapper[4891]: I0929 09:59:14.555939 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qns5g" event={"ID":"08922d5d-a7ec-41c0-9085-bcc17847df78","Type":"ContainerStarted","Data":"841d7451dd7f0ac032dfd43ebc63a657b1ef3181b9194b9e16aa8a525d03c9e4"} Sep 29 09:59:14 crc kubenswrapper[4891]: I0929 09:59:14.572831 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qns5g" podStartSLOduration=1.6151392759999998 podStartE2EDuration="4.572808115s" podCreationTimestamp="2025-09-29 09:59:10 +0000 UTC" firstStartedPulling="2025-09-29 09:59:10.718925843 +0000 UTC m=+680.924094154" lastFinishedPulling="2025-09-29 09:59:13.676594672 +0000 UTC m=+683.881762993" observedRunningTime="2025-09-29 09:59:14.568858006 +0000 UTC m=+684.774026357" watchObservedRunningTime="2025-09-29 09:59:14.572808115 +0000 UTC m=+684.777976436" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.149685 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-7mggl"] Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 
09:59:21.153588 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-7mggl" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.157037 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-h5mfq"] Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.158072 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-h5mfq" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.158779 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-69fw4" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.163026 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.167699 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-7mggl"] Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.175600 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-h5mfq"] Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.206435 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-hkgzc"] Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.207743 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-hkgzc" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.218694 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgwsl\" (UniqueName: \"kubernetes.io/projected/45398cc7-ef38-4555-befb-ac59051493ed-kube-api-access-fgwsl\") pod \"nmstate-metrics-58fcddf996-7mggl\" (UID: \"45398cc7-ef38-4555-befb-ac59051493ed\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-7mggl" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.219240 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn5ww\" (UniqueName: \"kubernetes.io/projected/0cde15e7-98b3-44c6-9d10-927909f5f269-kube-api-access-mn5ww\") pod \"nmstate-webhook-6d689559c5-h5mfq\" (UID: \"0cde15e7-98b3-44c6-9d10-927909f5f269\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-h5mfq" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.219438 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0cde15e7-98b3-44c6-9d10-927909f5f269-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-h5mfq\" (UID: \"0cde15e7-98b3-44c6-9d10-927909f5f269\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-h5mfq" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.293128 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-dfnnl"] Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.294137 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-dfnnl" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.298177 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.298248 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.298392 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-g7d7m" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.315987 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-dfnnl"] Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.322015 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/33746ad6-4439-446b-bea7-2797ca5a9c37-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-dfnnl\" (UID: \"33746ad6-4439-446b-bea7-2797ca5a9c37\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-dfnnl" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.322098 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0cde15e7-98b3-44c6-9d10-927909f5f269-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-h5mfq\" (UID: \"0cde15e7-98b3-44c6-9d10-927909f5f269\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-h5mfq" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.322128 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgwsl\" (UniqueName: \"kubernetes.io/projected/45398cc7-ef38-4555-befb-ac59051493ed-kube-api-access-fgwsl\") pod \"nmstate-metrics-58fcddf996-7mggl\" (UID: \"45398cc7-ef38-4555-befb-ac59051493ed\") " 
pod="openshift-nmstate/nmstate-metrics-58fcddf996-7mggl" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.322156 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49c8k\" (UniqueName: \"kubernetes.io/projected/393d2298-0458-4346-bfe0-d492fb362511-kube-api-access-49c8k\") pod \"nmstate-handler-hkgzc\" (UID: \"393d2298-0458-4346-bfe0-d492fb362511\") " pod="openshift-nmstate/nmstate-handler-hkgzc" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.322181 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q9n6\" (UniqueName: \"kubernetes.io/projected/33746ad6-4439-446b-bea7-2797ca5a9c37-kube-api-access-6q9n6\") pod \"nmstate-console-plugin-864bb6dfb5-dfnnl\" (UID: \"33746ad6-4439-446b-bea7-2797ca5a9c37\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-dfnnl" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.322221 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/393d2298-0458-4346-bfe0-d492fb362511-nmstate-lock\") pod \"nmstate-handler-hkgzc\" (UID: \"393d2298-0458-4346-bfe0-d492fb362511\") " pod="openshift-nmstate/nmstate-handler-hkgzc" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.322246 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/393d2298-0458-4346-bfe0-d492fb362511-ovs-socket\") pod \"nmstate-handler-hkgzc\" (UID: \"393d2298-0458-4346-bfe0-d492fb362511\") " pod="openshift-nmstate/nmstate-handler-hkgzc" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.322272 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/393d2298-0458-4346-bfe0-d492fb362511-dbus-socket\") pod 
\"nmstate-handler-hkgzc\" (UID: \"393d2298-0458-4346-bfe0-d492fb362511\") " pod="openshift-nmstate/nmstate-handler-hkgzc" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.322290 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/33746ad6-4439-446b-bea7-2797ca5a9c37-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-dfnnl\" (UID: \"33746ad6-4439-446b-bea7-2797ca5a9c37\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-dfnnl" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.322317 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn5ww\" (UniqueName: \"kubernetes.io/projected/0cde15e7-98b3-44c6-9d10-927909f5f269-kube-api-access-mn5ww\") pod \"nmstate-webhook-6d689559c5-h5mfq\" (UID: \"0cde15e7-98b3-44c6-9d10-927909f5f269\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-h5mfq" Sep 29 09:59:21 crc kubenswrapper[4891]: E0929 09:59:21.322311 4891 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Sep 29 09:59:21 crc kubenswrapper[4891]: E0929 09:59:21.322408 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cde15e7-98b3-44c6-9d10-927909f5f269-tls-key-pair podName:0cde15e7-98b3-44c6-9d10-927909f5f269 nodeName:}" failed. No retries permitted until 2025-09-29 09:59:21.82237491 +0000 UTC m=+692.027543441 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/0cde15e7-98b3-44c6-9d10-927909f5f269-tls-key-pair") pod "nmstate-webhook-6d689559c5-h5mfq" (UID: "0cde15e7-98b3-44c6-9d10-927909f5f269") : secret "openshift-nmstate-webhook" not found Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.341422 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgwsl\" (UniqueName: \"kubernetes.io/projected/45398cc7-ef38-4555-befb-ac59051493ed-kube-api-access-fgwsl\") pod \"nmstate-metrics-58fcddf996-7mggl\" (UID: \"45398cc7-ef38-4555-befb-ac59051493ed\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-7mggl" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.342481 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn5ww\" (UniqueName: \"kubernetes.io/projected/0cde15e7-98b3-44c6-9d10-927909f5f269-kube-api-access-mn5ww\") pod \"nmstate-webhook-6d689559c5-h5mfq\" (UID: \"0cde15e7-98b3-44c6-9d10-927909f5f269\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-h5mfq" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.423801 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49c8k\" (UniqueName: \"kubernetes.io/projected/393d2298-0458-4346-bfe0-d492fb362511-kube-api-access-49c8k\") pod \"nmstate-handler-hkgzc\" (UID: \"393d2298-0458-4346-bfe0-d492fb362511\") " pod="openshift-nmstate/nmstate-handler-hkgzc" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.424375 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q9n6\" (UniqueName: \"kubernetes.io/projected/33746ad6-4439-446b-bea7-2797ca5a9c37-kube-api-access-6q9n6\") pod \"nmstate-console-plugin-864bb6dfb5-dfnnl\" (UID: \"33746ad6-4439-446b-bea7-2797ca5a9c37\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-dfnnl" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.424429 4891 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/393d2298-0458-4346-bfe0-d492fb362511-nmstate-lock\") pod \"nmstate-handler-hkgzc\" (UID: \"393d2298-0458-4346-bfe0-d492fb362511\") " pod="openshift-nmstate/nmstate-handler-hkgzc" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.424487 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/393d2298-0458-4346-bfe0-d492fb362511-ovs-socket\") pod \"nmstate-handler-hkgzc\" (UID: \"393d2298-0458-4346-bfe0-d492fb362511\") " pod="openshift-nmstate/nmstate-handler-hkgzc" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.424548 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/393d2298-0458-4346-bfe0-d492fb362511-dbus-socket\") pod \"nmstate-handler-hkgzc\" (UID: \"393d2298-0458-4346-bfe0-d492fb362511\") " pod="openshift-nmstate/nmstate-handler-hkgzc" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.424577 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/33746ad6-4439-446b-bea7-2797ca5a9c37-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-dfnnl\" (UID: \"33746ad6-4439-446b-bea7-2797ca5a9c37\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-dfnnl" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.424596 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/393d2298-0458-4346-bfe0-d492fb362511-nmstate-lock\") pod \"nmstate-handler-hkgzc\" (UID: \"393d2298-0458-4346-bfe0-d492fb362511\") " pod="openshift-nmstate/nmstate-handler-hkgzc" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.424646 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/33746ad6-4439-446b-bea7-2797ca5a9c37-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-dfnnl\" (UID: \"33746ad6-4439-446b-bea7-2797ca5a9c37\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-dfnnl" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.425078 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/393d2298-0458-4346-bfe0-d492fb362511-ovs-socket\") pod \"nmstate-handler-hkgzc\" (UID: \"393d2298-0458-4346-bfe0-d492fb362511\") " pod="openshift-nmstate/nmstate-handler-hkgzc" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.425174 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/393d2298-0458-4346-bfe0-d492fb362511-dbus-socket\") pod \"nmstate-handler-hkgzc\" (UID: \"393d2298-0458-4346-bfe0-d492fb362511\") " pod="openshift-nmstate/nmstate-handler-hkgzc" Sep 29 09:59:21 crc kubenswrapper[4891]: E0929 09:59:21.425248 4891 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Sep 29 09:59:21 crc kubenswrapper[4891]: E0929 09:59:21.425405 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33746ad6-4439-446b-bea7-2797ca5a9c37-plugin-serving-cert podName:33746ad6-4439-446b-bea7-2797ca5a9c37 nodeName:}" failed. No retries permitted until 2025-09-29 09:59:21.925368688 +0000 UTC m=+692.130537179 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/33746ad6-4439-446b-bea7-2797ca5a9c37-plugin-serving-cert") pod "nmstate-console-plugin-864bb6dfb5-dfnnl" (UID: "33746ad6-4439-446b-bea7-2797ca5a9c37") : secret "plugin-serving-cert" not found Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.425964 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/33746ad6-4439-446b-bea7-2797ca5a9c37-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-dfnnl\" (UID: \"33746ad6-4439-446b-bea7-2797ca5a9c37\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-dfnnl" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.447525 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49c8k\" (UniqueName: \"kubernetes.io/projected/393d2298-0458-4346-bfe0-d492fb362511-kube-api-access-49c8k\") pod \"nmstate-handler-hkgzc\" (UID: \"393d2298-0458-4346-bfe0-d492fb362511\") " pod="openshift-nmstate/nmstate-handler-hkgzc" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.448401 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q9n6\" (UniqueName: \"kubernetes.io/projected/33746ad6-4439-446b-bea7-2797ca5a9c37-kube-api-access-6q9n6\") pod \"nmstate-console-plugin-864bb6dfb5-dfnnl\" (UID: \"33746ad6-4439-446b-bea7-2797ca5a9c37\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-dfnnl" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.494929 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-7mggl" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.494970 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5bc7cf7b67-5q7tg"] Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.496018 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.520681 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bc7cf7b67-5q7tg"] Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.526439 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/475f4bd6-fef2-4e00-abe2-9689830508c5-service-ca\") pod \"console-5bc7cf7b67-5q7tg\" (UID: \"475f4bd6-fef2-4e00-abe2-9689830508c5\") " pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.526899 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/475f4bd6-fef2-4e00-abe2-9689830508c5-console-oauth-config\") pod \"console-5bc7cf7b67-5q7tg\" (UID: \"475f4bd6-fef2-4e00-abe2-9689830508c5\") " pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.527022 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/475f4bd6-fef2-4e00-abe2-9689830508c5-trusted-ca-bundle\") pod \"console-5bc7cf7b67-5q7tg\" (UID: \"475f4bd6-fef2-4e00-abe2-9689830508c5\") " pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.527128 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/475f4bd6-fef2-4e00-abe2-9689830508c5-oauth-serving-cert\") pod \"console-5bc7cf7b67-5q7tg\" (UID: \"475f4bd6-fef2-4e00-abe2-9689830508c5\") " pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.527230 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/475f4bd6-fef2-4e00-abe2-9689830508c5-console-config\") pod \"console-5bc7cf7b67-5q7tg\" (UID: \"475f4bd6-fef2-4e00-abe2-9689830508c5\") " pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.527414 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/475f4bd6-fef2-4e00-abe2-9689830508c5-console-serving-cert\") pod \"console-5bc7cf7b67-5q7tg\" (UID: \"475f4bd6-fef2-4e00-abe2-9689830508c5\") " pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.527525 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5cxw\" (UniqueName: \"kubernetes.io/projected/475f4bd6-fef2-4e00-abe2-9689830508c5-kube-api-access-z5cxw\") pod \"console-5bc7cf7b67-5q7tg\" (UID: \"475f4bd6-fef2-4e00-abe2-9689830508c5\") " pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.553171 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-hkgzc" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.604519 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-hkgzc" event={"ID":"393d2298-0458-4346-bfe0-d492fb362511","Type":"ContainerStarted","Data":"66921c317de43572a0bab0bef83773c271104358d66a43ebc1faa7f64c329695"} Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.628777 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/475f4bd6-fef2-4e00-abe2-9689830508c5-console-serving-cert\") pod \"console-5bc7cf7b67-5q7tg\" (UID: \"475f4bd6-fef2-4e00-abe2-9689830508c5\") " pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.629190 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5cxw\" (UniqueName: \"kubernetes.io/projected/475f4bd6-fef2-4e00-abe2-9689830508c5-kube-api-access-z5cxw\") pod \"console-5bc7cf7b67-5q7tg\" (UID: \"475f4bd6-fef2-4e00-abe2-9689830508c5\") " pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.629613 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/475f4bd6-fef2-4e00-abe2-9689830508c5-service-ca\") pod \"console-5bc7cf7b67-5q7tg\" (UID: \"475f4bd6-fef2-4e00-abe2-9689830508c5\") " pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.629681 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/475f4bd6-fef2-4e00-abe2-9689830508c5-console-oauth-config\") pod \"console-5bc7cf7b67-5q7tg\" (UID: \"475f4bd6-fef2-4e00-abe2-9689830508c5\") " pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:21 crc 
kubenswrapper[4891]: I0929 09:59:21.629709 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/475f4bd6-fef2-4e00-abe2-9689830508c5-trusted-ca-bundle\") pod \"console-5bc7cf7b67-5q7tg\" (UID: \"475f4bd6-fef2-4e00-abe2-9689830508c5\") " pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.629739 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/475f4bd6-fef2-4e00-abe2-9689830508c5-oauth-serving-cert\") pod \"console-5bc7cf7b67-5q7tg\" (UID: \"475f4bd6-fef2-4e00-abe2-9689830508c5\") " pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.629763 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/475f4bd6-fef2-4e00-abe2-9689830508c5-console-config\") pod \"console-5bc7cf7b67-5q7tg\" (UID: \"475f4bd6-fef2-4e00-abe2-9689830508c5\") " pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.631433 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/475f4bd6-fef2-4e00-abe2-9689830508c5-trusted-ca-bundle\") pod \"console-5bc7cf7b67-5q7tg\" (UID: \"475f4bd6-fef2-4e00-abe2-9689830508c5\") " pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.631721 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/475f4bd6-fef2-4e00-abe2-9689830508c5-service-ca\") pod \"console-5bc7cf7b67-5q7tg\" (UID: \"475f4bd6-fef2-4e00-abe2-9689830508c5\") " pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.632586 4891 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/475f4bd6-fef2-4e00-abe2-9689830508c5-oauth-serving-cert\") pod \"console-5bc7cf7b67-5q7tg\" (UID: \"475f4bd6-fef2-4e00-abe2-9689830508c5\") " pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.632969 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/475f4bd6-fef2-4e00-abe2-9689830508c5-console-config\") pod \"console-5bc7cf7b67-5q7tg\" (UID: \"475f4bd6-fef2-4e00-abe2-9689830508c5\") " pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.638014 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/475f4bd6-fef2-4e00-abe2-9689830508c5-console-oauth-config\") pod \"console-5bc7cf7b67-5q7tg\" (UID: \"475f4bd6-fef2-4e00-abe2-9689830508c5\") " pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.641271 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/475f4bd6-fef2-4e00-abe2-9689830508c5-console-serving-cert\") pod \"console-5bc7cf7b67-5q7tg\" (UID: \"475f4bd6-fef2-4e00-abe2-9689830508c5\") " pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.651531 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5cxw\" (UniqueName: \"kubernetes.io/projected/475f4bd6-fef2-4e00-abe2-9689830508c5-kube-api-access-z5cxw\") pod \"console-5bc7cf7b67-5q7tg\" (UID: \"475f4bd6-fef2-4e00-abe2-9689830508c5\") " pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.831878 4891 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0cde15e7-98b3-44c6-9d10-927909f5f269-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-h5mfq\" (UID: \"0cde15e7-98b3-44c6-9d10-927909f5f269\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-h5mfq" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.834992 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0cde15e7-98b3-44c6-9d10-927909f5f269-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-h5mfq\" (UID: \"0cde15e7-98b3-44c6-9d10-927909f5f269\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-h5mfq" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.867568 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.933814 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/33746ad6-4439-446b-bea7-2797ca5a9c37-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-dfnnl\" (UID: \"33746ad6-4439-446b-bea7-2797ca5a9c37\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-dfnnl" Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.941455 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-7mggl"] Sep 29 09:59:21 crc kubenswrapper[4891]: I0929 09:59:21.943468 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/33746ad6-4439-446b-bea7-2797ca5a9c37-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-dfnnl\" (UID: \"33746ad6-4439-446b-bea7-2797ca5a9c37\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-dfnnl" Sep 29 09:59:21 crc kubenswrapper[4891]: W0929 09:59:21.951456 4891 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45398cc7_ef38_4555_befb_ac59051493ed.slice/crio-390ef41a4a4b9a1b656420b7f97eab38fcb81d595c5e225c91d9aee761ed4b66 WatchSource:0}: Error finding container 390ef41a4a4b9a1b656420b7f97eab38fcb81d595c5e225c91d9aee761ed4b66: Status 404 returned error can't find the container with id 390ef41a4a4b9a1b656420b7f97eab38fcb81d595c5e225c91d9aee761ed4b66 Sep 29 09:59:22 crc kubenswrapper[4891]: I0929 09:59:22.094022 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-h5mfq" Sep 29 09:59:22 crc kubenswrapper[4891]: I0929 09:59:22.216542 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-dfnnl" Sep 29 09:59:22 crc kubenswrapper[4891]: I0929 09:59:22.319414 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-h5mfq"] Sep 29 09:59:22 crc kubenswrapper[4891]: W0929 09:59:22.351932 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cde15e7_98b3_44c6_9d10_927909f5f269.slice/crio-8f28542012e6c7ccc854e83c262b1ab6338d901b4fca3f928a1283518836e3c8 WatchSource:0}: Error finding container 8f28542012e6c7ccc854e83c262b1ab6338d901b4fca3f928a1283518836e3c8: Status 404 returned error can't find the container with id 8f28542012e6c7ccc854e83c262b1ab6338d901b4fca3f928a1283518836e3c8 Sep 29 09:59:22 crc kubenswrapper[4891]: I0929 09:59:22.378697 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bc7cf7b67-5q7tg"] Sep 29 09:59:22 crc kubenswrapper[4891]: I0929 09:59:22.612762 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-7mggl" 
event={"ID":"45398cc7-ef38-4555-befb-ac59051493ed","Type":"ContainerStarted","Data":"390ef41a4a4b9a1b656420b7f97eab38fcb81d595c5e225c91d9aee761ed4b66"} Sep 29 09:59:22 crc kubenswrapper[4891]: I0929 09:59:22.615562 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bc7cf7b67-5q7tg" event={"ID":"475f4bd6-fef2-4e00-abe2-9689830508c5","Type":"ContainerStarted","Data":"85bac65cb78a80ce446b4fcba19290e3b60d9beb5117f247e9a2e9fc8cd18961"} Sep 29 09:59:22 crc kubenswrapper[4891]: I0929 09:59:22.615640 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bc7cf7b67-5q7tg" event={"ID":"475f4bd6-fef2-4e00-abe2-9689830508c5","Type":"ContainerStarted","Data":"cec0585b46c59f00023a04a411f83678ad190427ce2c2cbcf42aa4a8080c4ac3"} Sep 29 09:59:22 crc kubenswrapper[4891]: I0929 09:59:22.621234 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-h5mfq" event={"ID":"0cde15e7-98b3-44c6-9d10-927909f5f269","Type":"ContainerStarted","Data":"8f28542012e6c7ccc854e83c262b1ab6338d901b4fca3f928a1283518836e3c8"} Sep 29 09:59:22 crc kubenswrapper[4891]: I0929 09:59:22.656284 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5bc7cf7b67-5q7tg" podStartSLOduration=1.656256488 podStartE2EDuration="1.656256488s" podCreationTimestamp="2025-09-29 09:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:59:22.640332699 +0000 UTC m=+692.845501030" watchObservedRunningTime="2025-09-29 09:59:22.656256488 +0000 UTC m=+692.861424829" Sep 29 09:59:22 crc kubenswrapper[4891]: I0929 09:59:22.659131 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-dfnnl"] Sep 29 09:59:22 crc kubenswrapper[4891]: W0929 09:59:22.667626 4891 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33746ad6_4439_446b_bea7_2797ca5a9c37.slice/crio-73a7317590cae7dbfc993ccf37556ddf3a4f9caeef716cd3ce06c26ac1edb64d WatchSource:0}: Error finding container 73a7317590cae7dbfc993ccf37556ddf3a4f9caeef716cd3ce06c26ac1edb64d: Status 404 returned error can't find the container with id 73a7317590cae7dbfc993ccf37556ddf3a4f9caeef716cd3ce06c26ac1edb64d Sep 29 09:59:23 crc kubenswrapper[4891]: I0929 09:59:23.631516 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-dfnnl" event={"ID":"33746ad6-4439-446b-bea7-2797ca5a9c37","Type":"ContainerStarted","Data":"73a7317590cae7dbfc993ccf37556ddf3a4f9caeef716cd3ce06c26ac1edb64d"} Sep 29 09:59:24 crc kubenswrapper[4891]: I0929 09:59:24.640511 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-hkgzc" event={"ID":"393d2298-0458-4346-bfe0-d492fb362511","Type":"ContainerStarted","Data":"af7e41d104cc7094257972bb5a68bebab45d21c22fbefb6e532be68c4ba4cb48"} Sep 29 09:59:24 crc kubenswrapper[4891]: I0929 09:59:24.640979 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-hkgzc" Sep 29 09:59:24 crc kubenswrapper[4891]: I0929 09:59:24.642488 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-h5mfq" event={"ID":"0cde15e7-98b3-44c6-9d10-927909f5f269","Type":"ContainerStarted","Data":"1886da8eea9124677ad1b1b4dc96f7695f61260a6954a21912796125733bd83d"} Sep 29 09:59:24 crc kubenswrapper[4891]: I0929 09:59:24.642564 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-h5mfq" Sep 29 09:59:24 crc kubenswrapper[4891]: I0929 09:59:24.644845 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-7mggl" 
event={"ID":"45398cc7-ef38-4555-befb-ac59051493ed","Type":"ContainerStarted","Data":"c3c0db2abc76cba1e67e3c76f0e10c6eb7a222f384bae76d741edb8265b24b5c"} Sep 29 09:59:24 crc kubenswrapper[4891]: I0929 09:59:24.657399 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-hkgzc" podStartSLOduration=1.318558919 podStartE2EDuration="3.657157254s" podCreationTimestamp="2025-09-29 09:59:21 +0000 UTC" firstStartedPulling="2025-09-29 09:59:21.57534371 +0000 UTC m=+691.780512031" lastFinishedPulling="2025-09-29 09:59:23.913942045 +0000 UTC m=+694.119110366" observedRunningTime="2025-09-29 09:59:24.656872325 +0000 UTC m=+694.862040656" watchObservedRunningTime="2025-09-29 09:59:24.657157254 +0000 UTC m=+694.862325575" Sep 29 09:59:24 crc kubenswrapper[4891]: I0929 09:59:24.679397 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-h5mfq" podStartSLOduration=2.120025751 podStartE2EDuration="3.679366262s" podCreationTimestamp="2025-09-29 09:59:21 +0000 UTC" firstStartedPulling="2025-09-29 09:59:22.356318875 +0000 UTC m=+692.561487186" lastFinishedPulling="2025-09-29 09:59:23.915659376 +0000 UTC m=+694.120827697" observedRunningTime="2025-09-29 09:59:24.674566148 +0000 UTC m=+694.879734469" watchObservedRunningTime="2025-09-29 09:59:24.679366262 +0000 UTC m=+694.884534603" Sep 29 09:59:25 crc kubenswrapper[4891]: I0929 09:59:25.659177 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-dfnnl" event={"ID":"33746ad6-4439-446b-bea7-2797ca5a9c37","Type":"ContainerStarted","Data":"960373ed45ec239137717fdc0818302100fc867d454cde20aff76d78790fbb8d"} Sep 29 09:59:25 crc kubenswrapper[4891]: I0929 09:59:25.688247 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-dfnnl" podStartSLOduration=2.43080344 
podStartE2EDuration="4.688212873s" podCreationTimestamp="2025-09-29 09:59:21 +0000 UTC" firstStartedPulling="2025-09-29 09:59:22.672112145 +0000 UTC m=+692.877280466" lastFinishedPulling="2025-09-29 09:59:24.929521568 +0000 UTC m=+695.134689899" observedRunningTime="2025-09-29 09:59:25.675300314 +0000 UTC m=+695.880468675" watchObservedRunningTime="2025-09-29 09:59:25.688212873 +0000 UTC m=+695.893381214" Sep 29 09:59:27 crc kubenswrapper[4891]: I0929 09:59:27.679121 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-7mggl" event={"ID":"45398cc7-ef38-4555-befb-ac59051493ed","Type":"ContainerStarted","Data":"b84967ba024edbfcb05be388352ef44a66c50c0adab12d17f382f7bdc535851a"} Sep 29 09:59:27 crc kubenswrapper[4891]: I0929 09:59:27.705290 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58fcddf996-7mggl" podStartSLOduration=2.141219808 podStartE2EDuration="6.705254533s" podCreationTimestamp="2025-09-29 09:59:21 +0000 UTC" firstStartedPulling="2025-09-29 09:59:21.955473796 +0000 UTC m=+692.160642117" lastFinishedPulling="2025-09-29 09:59:26.519508521 +0000 UTC m=+696.724676842" observedRunningTime="2025-09-29 09:59:27.702432208 +0000 UTC m=+697.907600619" watchObservedRunningTime="2025-09-29 09:59:27.705254533 +0000 UTC m=+697.910422894" Sep 29 09:59:31 crc kubenswrapper[4891]: I0929 09:59:31.578411 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-hkgzc" Sep 29 09:59:31 crc kubenswrapper[4891]: I0929 09:59:31.868131 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:31 crc kubenswrapper[4891]: I0929 09:59:31.868207 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:31 crc kubenswrapper[4891]: I0929 09:59:31.875609 4891 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:32 crc kubenswrapper[4891]: I0929 09:59:32.720046 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5bc7cf7b67-5q7tg" Sep 29 09:59:32 crc kubenswrapper[4891]: I0929 09:59:32.783195 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-4nznm"] Sep 29 09:59:42 crc kubenswrapper[4891]: I0929 09:59:42.103571 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-h5mfq" Sep 29 09:59:54 crc kubenswrapper[4891]: I0929 09:59:54.144926 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6"] Sep 29 09:59:54 crc kubenswrapper[4891]: I0929 09:59:54.146836 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6" Sep 29 09:59:54 crc kubenswrapper[4891]: I0929 09:59:54.149268 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 29 09:59:54 crc kubenswrapper[4891]: I0929 09:59:54.199642 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6"] Sep 29 09:59:54 crc kubenswrapper[4891]: I0929 09:59:54.217172 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g2r2\" (UniqueName: \"kubernetes.io/projected/acde17bc-adb2-4193-a40f-d9a062f4f67a-kube-api-access-7g2r2\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6\" (UID: \"acde17bc-adb2-4193-a40f-d9a062f4f67a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6" 
Sep 29 09:59:54 crc kubenswrapper[4891]: I0929 09:59:54.217266 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/acde17bc-adb2-4193-a40f-d9a062f4f67a-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6\" (UID: \"acde17bc-adb2-4193-a40f-d9a062f4f67a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6" Sep 29 09:59:54 crc kubenswrapper[4891]: I0929 09:59:54.217312 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/acde17bc-adb2-4193-a40f-d9a062f4f67a-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6\" (UID: \"acde17bc-adb2-4193-a40f-d9a062f4f67a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6" Sep 29 09:59:54 crc kubenswrapper[4891]: I0929 09:59:54.318530 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g2r2\" (UniqueName: \"kubernetes.io/projected/acde17bc-adb2-4193-a40f-d9a062f4f67a-kube-api-access-7g2r2\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6\" (UID: \"acde17bc-adb2-4193-a40f-d9a062f4f67a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6" Sep 29 09:59:54 crc kubenswrapper[4891]: I0929 09:59:54.318638 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/acde17bc-adb2-4193-a40f-d9a062f4f67a-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6\" (UID: \"acde17bc-adb2-4193-a40f-d9a062f4f67a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6" Sep 29 09:59:54 crc kubenswrapper[4891]: I0929 09:59:54.318677 4891 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/acde17bc-adb2-4193-a40f-d9a062f4f67a-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6\" (UID: \"acde17bc-adb2-4193-a40f-d9a062f4f67a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6" Sep 29 09:59:54 crc kubenswrapper[4891]: I0929 09:59:54.319601 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/acde17bc-adb2-4193-a40f-d9a062f4f67a-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6\" (UID: \"acde17bc-adb2-4193-a40f-d9a062f4f67a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6" Sep 29 09:59:54 crc kubenswrapper[4891]: I0929 09:59:54.319742 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/acde17bc-adb2-4193-a40f-d9a062f4f67a-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6\" (UID: \"acde17bc-adb2-4193-a40f-d9a062f4f67a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6" Sep 29 09:59:54 crc kubenswrapper[4891]: I0929 09:59:54.345313 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g2r2\" (UniqueName: \"kubernetes.io/projected/acde17bc-adb2-4193-a40f-d9a062f4f67a-kube-api-access-7g2r2\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6\" (UID: \"acde17bc-adb2-4193-a40f-d9a062f4f67a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6" Sep 29 09:59:54 crc kubenswrapper[4891]: I0929 09:59:54.464624 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6" Sep 29 09:59:54 crc kubenswrapper[4891]: I0929 09:59:54.973553 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6"] Sep 29 09:59:54 crc kubenswrapper[4891]: W0929 09:59:54.990322 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacde17bc_adb2_4193_a40f_d9a062f4f67a.slice/crio-b50458026ec1a513c0d3cba11f9c941ef6db95b21b397ef49dc0e1d29a915a97 WatchSource:0}: Error finding container b50458026ec1a513c0d3cba11f9c941ef6db95b21b397ef49dc0e1d29a915a97: Status 404 returned error can't find the container with id b50458026ec1a513c0d3cba11f9c941ef6db95b21b397ef49dc0e1d29a915a97 Sep 29 09:59:55 crc kubenswrapper[4891]: I0929 09:59:55.858546 4891 generic.go:334] "Generic (PLEG): container finished" podID="acde17bc-adb2-4193-a40f-d9a062f4f67a" containerID="4ec662bb59904b5aa4c85b1dcaaef14c5ee1cabb74ebe70a9e9768fabd88683d" exitCode=0 Sep 29 09:59:55 crc kubenswrapper[4891]: I0929 09:59:55.859044 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6" event={"ID":"acde17bc-adb2-4193-a40f-d9a062f4f67a","Type":"ContainerDied","Data":"4ec662bb59904b5aa4c85b1dcaaef14c5ee1cabb74ebe70a9e9768fabd88683d"} Sep 29 09:59:55 crc kubenswrapper[4891]: I0929 09:59:55.859092 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6" event={"ID":"acde17bc-adb2-4193-a40f-d9a062f4f67a","Type":"ContainerStarted","Data":"b50458026ec1a513c0d3cba11f9c941ef6db95b21b397ef49dc0e1d29a915a97"} Sep 29 09:59:57 crc kubenswrapper[4891]: I0929 09:59:57.820354 4891 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-console/console-f9d7485db-4nznm" podUID="4fe5a5b3-033b-4d7e-8829-65de16f908a2" containerName="console" containerID="cri-o://7fb5aa24ce710d594b6481c96d1336f97320bdd5a00e6cd418fef47bf3f1fedc" gracePeriod=15 Sep 29 09:59:57 crc kubenswrapper[4891]: I0929 09:59:57.883900 4891 generic.go:334] "Generic (PLEG): container finished" podID="acde17bc-adb2-4193-a40f-d9a062f4f67a" containerID="bee133a7e5c616ae4ca8d13f6c2595e5fd3367f06652a79fcae2dbf059955fdc" exitCode=0 Sep 29 09:59:57 crc kubenswrapper[4891]: I0929 09:59:57.883983 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6" event={"ID":"acde17bc-adb2-4193-a40f-d9a062f4f67a","Type":"ContainerDied","Data":"bee133a7e5c616ae4ca8d13f6c2595e5fd3367f06652a79fcae2dbf059955fdc"} Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.220920 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-4nznm_4fe5a5b3-033b-4d7e-8829-65de16f908a2/console/0.log" Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.221370 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.276251 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4fe5a5b3-033b-4d7e-8829-65de16f908a2-console-oauth-config\") pod \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.276345 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4fe5a5b3-033b-4d7e-8829-65de16f908a2-trusted-ca-bundle\") pod \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.276387 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rl2mb\" (UniqueName: \"kubernetes.io/projected/4fe5a5b3-033b-4d7e-8829-65de16f908a2-kube-api-access-rl2mb\") pod \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.276432 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4fe5a5b3-033b-4d7e-8829-65de16f908a2-oauth-serving-cert\") pod \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.277267 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fe5a5b3-033b-4d7e-8829-65de16f908a2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4fe5a5b3-033b-4d7e-8829-65de16f908a2" (UID: "4fe5a5b3-033b-4d7e-8829-65de16f908a2"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.277390 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fe5a5b3-033b-4d7e-8829-65de16f908a2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4fe5a5b3-033b-4d7e-8829-65de16f908a2" (UID: "4fe5a5b3-033b-4d7e-8829-65de16f908a2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.277515 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4fe5a5b3-033b-4d7e-8829-65de16f908a2-console-serving-cert\") pod \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.277696 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4fe5a5b3-033b-4d7e-8829-65de16f908a2-console-config\") pod \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.277784 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4fe5a5b3-033b-4d7e-8829-65de16f908a2-service-ca\") pod \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\" (UID: \"4fe5a5b3-033b-4d7e-8829-65de16f908a2\") " Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.278263 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fe5a5b3-033b-4d7e-8829-65de16f908a2-console-config" (OuterVolumeSpecName: "console-config") pod "4fe5a5b3-033b-4d7e-8829-65de16f908a2" (UID: "4fe5a5b3-033b-4d7e-8829-65de16f908a2"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.278555 4891 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4fe5a5b3-033b-4d7e-8829-65de16f908a2-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.278737 4891 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4fe5a5b3-033b-4d7e-8829-65de16f908a2-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.278760 4891 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4fe5a5b3-033b-4d7e-8829-65de16f908a2-console-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.278968 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fe5a5b3-033b-4d7e-8829-65de16f908a2-service-ca" (OuterVolumeSpecName: "service-ca") pod "4fe5a5b3-033b-4d7e-8829-65de16f908a2" (UID: "4fe5a5b3-033b-4d7e-8829-65de16f908a2"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.282466 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fe5a5b3-033b-4d7e-8829-65de16f908a2-kube-api-access-rl2mb" (OuterVolumeSpecName: "kube-api-access-rl2mb") pod "4fe5a5b3-033b-4d7e-8829-65de16f908a2" (UID: "4fe5a5b3-033b-4d7e-8829-65de16f908a2"). InnerVolumeSpecName "kube-api-access-rl2mb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.293550 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe5a5b3-033b-4d7e-8829-65de16f908a2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4fe5a5b3-033b-4d7e-8829-65de16f908a2" (UID: "4fe5a5b3-033b-4d7e-8829-65de16f908a2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.293708 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe5a5b3-033b-4d7e-8829-65de16f908a2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4fe5a5b3-033b-4d7e-8829-65de16f908a2" (UID: "4fe5a5b3-033b-4d7e-8829-65de16f908a2"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.380174 4891 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4fe5a5b3-033b-4d7e-8829-65de16f908a2-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.380222 4891 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4fe5a5b3-033b-4d7e-8829-65de16f908a2-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.380238 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rl2mb\" (UniqueName: \"kubernetes.io/projected/4fe5a5b3-033b-4d7e-8829-65de16f908a2-kube-api-access-rl2mb\") on node \"crc\" DevicePath \"\"" Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.380249 4891 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4fe5a5b3-033b-4d7e-8829-65de16f908a2-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.895742 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-4nznm_4fe5a5b3-033b-4d7e-8829-65de16f908a2/console/0.log" Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.896078 4891 generic.go:334] "Generic (PLEG): container finished" podID="4fe5a5b3-033b-4d7e-8829-65de16f908a2" containerID="7fb5aa24ce710d594b6481c96d1336f97320bdd5a00e6cd418fef47bf3f1fedc" exitCode=2 Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.896148 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4nznm" event={"ID":"4fe5a5b3-033b-4d7e-8829-65de16f908a2","Type":"ContainerDied","Data":"7fb5aa24ce710d594b6481c96d1336f97320bdd5a00e6cd418fef47bf3f1fedc"} Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.896161 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-4nznm" Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.896184 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4nznm" event={"ID":"4fe5a5b3-033b-4d7e-8829-65de16f908a2","Type":"ContainerDied","Data":"5dbb149b0ced16f2e1d3a5c0944fd81f307ef626f577dab6ab9adf0e78d91746"} Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.896208 4891 scope.go:117] "RemoveContainer" containerID="7fb5aa24ce710d594b6481c96d1336f97320bdd5a00e6cd418fef47bf3f1fedc" Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.899323 4891 generic.go:334] "Generic (PLEG): container finished" podID="acde17bc-adb2-4193-a40f-d9a062f4f67a" containerID="898c6697ca8206da2f7101926d1d76f6808dcb5d41e6cf39bb87be8fffee21c6" exitCode=0 Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.899364 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6" event={"ID":"acde17bc-adb2-4193-a40f-d9a062f4f67a","Type":"ContainerDied","Data":"898c6697ca8206da2f7101926d1d76f6808dcb5d41e6cf39bb87be8fffee21c6"} Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.922021 4891 scope.go:117] "RemoveContainer" containerID="7fb5aa24ce710d594b6481c96d1336f97320bdd5a00e6cd418fef47bf3f1fedc" Sep 29 09:59:58 crc kubenswrapper[4891]: E0929 09:59:58.922719 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fb5aa24ce710d594b6481c96d1336f97320bdd5a00e6cd418fef47bf3f1fedc\": container with ID starting with 7fb5aa24ce710d594b6481c96d1336f97320bdd5a00e6cd418fef47bf3f1fedc not found: ID does not exist" containerID="7fb5aa24ce710d594b6481c96d1336f97320bdd5a00e6cd418fef47bf3f1fedc" Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.922830 4891 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7fb5aa24ce710d594b6481c96d1336f97320bdd5a00e6cd418fef47bf3f1fedc"} err="failed to get container status \"7fb5aa24ce710d594b6481c96d1336f97320bdd5a00e6cd418fef47bf3f1fedc\": rpc error: code = NotFound desc = could not find container \"7fb5aa24ce710d594b6481c96d1336f97320bdd5a00e6cd418fef47bf3f1fedc\": container with ID starting with 7fb5aa24ce710d594b6481c96d1336f97320bdd5a00e6cd418fef47bf3f1fedc not found: ID does not exist" Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.931688 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-4nznm"] Sep 29 09:59:58 crc kubenswrapper[4891]: I0929 09:59:58.938192 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-4nznm"] Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.136501 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.140632 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319000-zw84t"] Sep 29 10:00:00 crc kubenswrapper[4891]: E0929 10:00:00.165536 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acde17bc-adb2-4193-a40f-d9a062f4f67a" containerName="util" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.166015 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="acde17bc-adb2-4193-a40f-d9a062f4f67a" containerName="util" Sep 29 10:00:00 crc kubenswrapper[4891]: E0929 10:00:00.166117 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acde17bc-adb2-4193-a40f-d9a062f4f67a" containerName="extract" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.166172 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="acde17bc-adb2-4193-a40f-d9a062f4f67a" containerName="extract" Sep 29 10:00:00 
crc kubenswrapper[4891]: E0929 10:00:00.166275 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fe5a5b3-033b-4d7e-8829-65de16f908a2" containerName="console" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.166336 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fe5a5b3-033b-4d7e-8829-65de16f908a2" containerName="console" Sep 29 10:00:00 crc kubenswrapper[4891]: E0929 10:00:00.166409 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acde17bc-adb2-4193-a40f-d9a062f4f67a" containerName="pull" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.166480 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="acde17bc-adb2-4193-a40f-d9a062f4f67a" containerName="pull" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.166987 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="acde17bc-adb2-4193-a40f-d9a062f4f67a" containerName="extract" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.167080 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fe5a5b3-033b-4d7e-8829-65de16f908a2" containerName="console" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.167871 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319000-zw84t"] Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.168020 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-zw84t" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.170760 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.171358 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.204058 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/acde17bc-adb2-4193-a40f-d9a062f4f67a-util\") pod \"acde17bc-adb2-4193-a40f-d9a062f4f67a\" (UID: \"acde17bc-adb2-4193-a40f-d9a062f4f67a\") " Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.204127 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g2r2\" (UniqueName: \"kubernetes.io/projected/acde17bc-adb2-4193-a40f-d9a062f4f67a-kube-api-access-7g2r2\") pod \"acde17bc-adb2-4193-a40f-d9a062f4f67a\" (UID: \"acde17bc-adb2-4193-a40f-d9a062f4f67a\") " Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.204148 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/acde17bc-adb2-4193-a40f-d9a062f4f67a-bundle\") pod \"acde17bc-adb2-4193-a40f-d9a062f4f67a\" (UID: \"acde17bc-adb2-4193-a40f-d9a062f4f67a\") " Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.204410 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fa97b41-25d5-4223-9fe0-bb9addf89617-config-volume\") pod \"collect-profiles-29319000-zw84t\" (UID: \"7fa97b41-25d5-4223-9fe0-bb9addf89617\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-zw84t" Sep 29 10:00:00 
crc kubenswrapper[4891]: I0929 10:00:00.204477 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fa97b41-25d5-4223-9fe0-bb9addf89617-secret-volume\") pod \"collect-profiles-29319000-zw84t\" (UID: \"7fa97b41-25d5-4223-9fe0-bb9addf89617\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-zw84t" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.204533 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gclg\" (UniqueName: \"kubernetes.io/projected/7fa97b41-25d5-4223-9fe0-bb9addf89617-kube-api-access-6gclg\") pod \"collect-profiles-29319000-zw84t\" (UID: \"7fa97b41-25d5-4223-9fe0-bb9addf89617\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-zw84t" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.206190 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acde17bc-adb2-4193-a40f-d9a062f4f67a-bundle" (OuterVolumeSpecName: "bundle") pod "acde17bc-adb2-4193-a40f-d9a062f4f67a" (UID: "acde17bc-adb2-4193-a40f-d9a062f4f67a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.211110 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acde17bc-adb2-4193-a40f-d9a062f4f67a-kube-api-access-7g2r2" (OuterVolumeSpecName: "kube-api-access-7g2r2") pod "acde17bc-adb2-4193-a40f-d9a062f4f67a" (UID: "acde17bc-adb2-4193-a40f-d9a062f4f67a"). InnerVolumeSpecName "kube-api-access-7g2r2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.222679 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acde17bc-adb2-4193-a40f-d9a062f4f67a-util" (OuterVolumeSpecName: "util") pod "acde17bc-adb2-4193-a40f-d9a062f4f67a" (UID: "acde17bc-adb2-4193-a40f-d9a062f4f67a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.306332 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fa97b41-25d5-4223-9fe0-bb9addf89617-config-volume\") pod \"collect-profiles-29319000-zw84t\" (UID: \"7fa97b41-25d5-4223-9fe0-bb9addf89617\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-zw84t" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.306417 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fa97b41-25d5-4223-9fe0-bb9addf89617-secret-volume\") pod \"collect-profiles-29319000-zw84t\" (UID: \"7fa97b41-25d5-4223-9fe0-bb9addf89617\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-zw84t" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.306456 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gclg\" (UniqueName: \"kubernetes.io/projected/7fa97b41-25d5-4223-9fe0-bb9addf89617-kube-api-access-6gclg\") pod \"collect-profiles-29319000-zw84t\" (UID: \"7fa97b41-25d5-4223-9fe0-bb9addf89617\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-zw84t" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.306517 4891 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/acde17bc-adb2-4193-a40f-d9a062f4f67a-util\") on node \"crc\" DevicePath \"\"" Sep 29 
10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.306530 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g2r2\" (UniqueName: \"kubernetes.io/projected/acde17bc-adb2-4193-a40f-d9a062f4f67a-kube-api-access-7g2r2\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.306540 4891 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/acde17bc-adb2-4193-a40f-d9a062f4f67a-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.308299 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fa97b41-25d5-4223-9fe0-bb9addf89617-config-volume\") pod \"collect-profiles-29319000-zw84t\" (UID: \"7fa97b41-25d5-4223-9fe0-bb9addf89617\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-zw84t" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.311775 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fa97b41-25d5-4223-9fe0-bb9addf89617-secret-volume\") pod \"collect-profiles-29319000-zw84t\" (UID: \"7fa97b41-25d5-4223-9fe0-bb9addf89617\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-zw84t" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.326565 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gclg\" (UniqueName: \"kubernetes.io/projected/7fa97b41-25d5-4223-9fe0-bb9addf89617-kube-api-access-6gclg\") pod \"collect-profiles-29319000-zw84t\" (UID: \"7fa97b41-25d5-4223-9fe0-bb9addf89617\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-zw84t" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.402543 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fe5a5b3-033b-4d7e-8829-65de16f908a2" 
path="/var/lib/kubelet/pods/4fe5a5b3-033b-4d7e-8829-65de16f908a2/volumes" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.489879 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-zw84t" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.664758 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319000-zw84t"] Sep 29 10:00:00 crc kubenswrapper[4891]: W0929 10:00:00.670629 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fa97b41_25d5_4223_9fe0_bb9addf89617.slice/crio-4360a4379008e8f7f2537206655f5a9a6e0f1aa694aae6892bbf9783db59d7b8 WatchSource:0}: Error finding container 4360a4379008e8f7f2537206655f5a9a6e0f1aa694aae6892bbf9783db59d7b8: Status 404 returned error can't find the container with id 4360a4379008e8f7f2537206655f5a9a6e0f1aa694aae6892bbf9783db59d7b8 Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.918834 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6" event={"ID":"acde17bc-adb2-4193-a40f-d9a062f4f67a","Type":"ContainerDied","Data":"b50458026ec1a513c0d3cba11f9c941ef6db95b21b397ef49dc0e1d29a915a97"} Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.919157 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b50458026ec1a513c0d3cba11f9c941ef6db95b21b397ef49dc0e1d29a915a97" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.918916 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6" Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.920564 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-zw84t" event={"ID":"7fa97b41-25d5-4223-9fe0-bb9addf89617","Type":"ContainerStarted","Data":"9f1d29ce791e74fa17280ecd74ef283a189c29a7f226f273237905f388bb3fbb"} Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.920609 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-zw84t" event={"ID":"7fa97b41-25d5-4223-9fe0-bb9addf89617","Type":"ContainerStarted","Data":"4360a4379008e8f7f2537206655f5a9a6e0f1aa694aae6892bbf9783db59d7b8"} Sep 29 10:00:00 crc kubenswrapper[4891]: I0929 10:00:00.937763 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-zw84t" podStartSLOduration=0.937714996 podStartE2EDuration="937.714996ms" podCreationTimestamp="2025-09-29 10:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:00:00.935055326 +0000 UTC m=+731.140223657" watchObservedRunningTime="2025-09-29 10:00:00.937714996 +0000 UTC m=+731.142883317" Sep 29 10:00:01 crc kubenswrapper[4891]: I0929 10:00:01.927494 4891 generic.go:334] "Generic (PLEG): container finished" podID="7fa97b41-25d5-4223-9fe0-bb9addf89617" containerID="9f1d29ce791e74fa17280ecd74ef283a189c29a7f226f273237905f388bb3fbb" exitCode=0 Sep 29 10:00:01 crc kubenswrapper[4891]: I0929 10:00:01.927543 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-zw84t" 
event={"ID":"7fa97b41-25d5-4223-9fe0-bb9addf89617","Type":"ContainerDied","Data":"9f1d29ce791e74fa17280ecd74ef283a189c29a7f226f273237905f388bb3fbb"} Sep 29 10:00:03 crc kubenswrapper[4891]: I0929 10:00:03.172579 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-zw84t" Sep 29 10:00:03 crc kubenswrapper[4891]: I0929 10:00:03.242848 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gclg\" (UniqueName: \"kubernetes.io/projected/7fa97b41-25d5-4223-9fe0-bb9addf89617-kube-api-access-6gclg\") pod \"7fa97b41-25d5-4223-9fe0-bb9addf89617\" (UID: \"7fa97b41-25d5-4223-9fe0-bb9addf89617\") " Sep 29 10:00:03 crc kubenswrapper[4891]: I0929 10:00:03.242917 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fa97b41-25d5-4223-9fe0-bb9addf89617-secret-volume\") pod \"7fa97b41-25d5-4223-9fe0-bb9addf89617\" (UID: \"7fa97b41-25d5-4223-9fe0-bb9addf89617\") " Sep 29 10:00:03 crc kubenswrapper[4891]: I0929 10:00:03.242963 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fa97b41-25d5-4223-9fe0-bb9addf89617-config-volume\") pod \"7fa97b41-25d5-4223-9fe0-bb9addf89617\" (UID: \"7fa97b41-25d5-4223-9fe0-bb9addf89617\") " Sep 29 10:00:03 crc kubenswrapper[4891]: I0929 10:00:03.244004 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fa97b41-25d5-4223-9fe0-bb9addf89617-config-volume" (OuterVolumeSpecName: "config-volume") pod "7fa97b41-25d5-4223-9fe0-bb9addf89617" (UID: "7fa97b41-25d5-4223-9fe0-bb9addf89617"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:03 crc kubenswrapper[4891]: I0929 10:00:03.248023 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fa97b41-25d5-4223-9fe0-bb9addf89617-kube-api-access-6gclg" (OuterVolumeSpecName: "kube-api-access-6gclg") pod "7fa97b41-25d5-4223-9fe0-bb9addf89617" (UID: "7fa97b41-25d5-4223-9fe0-bb9addf89617"). InnerVolumeSpecName "kube-api-access-6gclg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:00:03 crc kubenswrapper[4891]: I0929 10:00:03.248166 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa97b41-25d5-4223-9fe0-bb9addf89617-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7fa97b41-25d5-4223-9fe0-bb9addf89617" (UID: "7fa97b41-25d5-4223-9fe0-bb9addf89617"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:00:03 crc kubenswrapper[4891]: I0929 10:00:03.344390 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gclg\" (UniqueName: \"kubernetes.io/projected/7fa97b41-25d5-4223-9fe0-bb9addf89617-kube-api-access-6gclg\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:03 crc kubenswrapper[4891]: I0929 10:00:03.344450 4891 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fa97b41-25d5-4223-9fe0-bb9addf89617-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:03 crc kubenswrapper[4891]: I0929 10:00:03.344461 4891 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fa97b41-25d5-4223-9fe0-bb9addf89617-config-volume\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:03 crc kubenswrapper[4891]: I0929 10:00:03.949881 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-zw84t" 
event={"ID":"7fa97b41-25d5-4223-9fe0-bb9addf89617","Type":"ContainerDied","Data":"4360a4379008e8f7f2537206655f5a9a6e0f1aa694aae6892bbf9783db59d7b8"} Sep 29 10:00:03 crc kubenswrapper[4891]: I0929 10:00:03.951458 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4360a4379008e8f7f2537206655f5a9a6e0f1aa694aae6892bbf9783db59d7b8" Sep 29 10:00:03 crc kubenswrapper[4891]: I0929 10:00:03.950014 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-zw84t" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.285300 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-8dcfb8c5d-mbwfg"] Sep 29 10:00:10 crc kubenswrapper[4891]: E0929 10:00:10.285744 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa97b41-25d5-4223-9fe0-bb9addf89617" containerName="collect-profiles" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.285756 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa97b41-25d5-4223-9fe0-bb9addf89617" containerName="collect-profiles" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.285883 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fa97b41-25d5-4223-9fe0-bb9addf89617" containerName="collect-profiles" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.286280 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-8dcfb8c5d-mbwfg" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.290037 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.290295 4891 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-x9rpz" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.290589 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.290936 4891 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.293968 4891 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.303707 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-8dcfb8c5d-mbwfg"] Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.339826 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8a6839a2-c048-442c-a761-c6c1adec39a2-apiservice-cert\") pod \"metallb-operator-controller-manager-8dcfb8c5d-mbwfg\" (UID: \"8a6839a2-c048-442c-a761-c6c1adec39a2\") " pod="metallb-system/metallb-operator-controller-manager-8dcfb8c5d-mbwfg" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.339882 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fjbj\" (UniqueName: \"kubernetes.io/projected/8a6839a2-c048-442c-a761-c6c1adec39a2-kube-api-access-2fjbj\") pod 
\"metallb-operator-controller-manager-8dcfb8c5d-mbwfg\" (UID: \"8a6839a2-c048-442c-a761-c6c1adec39a2\") " pod="metallb-system/metallb-operator-controller-manager-8dcfb8c5d-mbwfg" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.339911 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8a6839a2-c048-442c-a761-c6c1adec39a2-webhook-cert\") pod \"metallb-operator-controller-manager-8dcfb8c5d-mbwfg\" (UID: \"8a6839a2-c048-442c-a761-c6c1adec39a2\") " pod="metallb-system/metallb-operator-controller-manager-8dcfb8c5d-mbwfg" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.441012 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fjbj\" (UniqueName: \"kubernetes.io/projected/8a6839a2-c048-442c-a761-c6c1adec39a2-kube-api-access-2fjbj\") pod \"metallb-operator-controller-manager-8dcfb8c5d-mbwfg\" (UID: \"8a6839a2-c048-442c-a761-c6c1adec39a2\") " pod="metallb-system/metallb-operator-controller-manager-8dcfb8c5d-mbwfg" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.441338 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8a6839a2-c048-442c-a761-c6c1adec39a2-webhook-cert\") pod \"metallb-operator-controller-manager-8dcfb8c5d-mbwfg\" (UID: \"8a6839a2-c048-442c-a761-c6c1adec39a2\") " pod="metallb-system/metallb-operator-controller-manager-8dcfb8c5d-mbwfg" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.441473 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8a6839a2-c048-442c-a761-c6c1adec39a2-apiservice-cert\") pod \"metallb-operator-controller-manager-8dcfb8c5d-mbwfg\" (UID: \"8a6839a2-c048-442c-a761-c6c1adec39a2\") " pod="metallb-system/metallb-operator-controller-manager-8dcfb8c5d-mbwfg" Sep 29 10:00:10 crc kubenswrapper[4891]: 
I0929 10:00:10.449611 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8a6839a2-c048-442c-a761-c6c1adec39a2-webhook-cert\") pod \"metallb-operator-controller-manager-8dcfb8c5d-mbwfg\" (UID: \"8a6839a2-c048-442c-a761-c6c1adec39a2\") " pod="metallb-system/metallb-operator-controller-manager-8dcfb8c5d-mbwfg" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.453355 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8a6839a2-c048-442c-a761-c6c1adec39a2-apiservice-cert\") pod \"metallb-operator-controller-manager-8dcfb8c5d-mbwfg\" (UID: \"8a6839a2-c048-442c-a761-c6c1adec39a2\") " pod="metallb-system/metallb-operator-controller-manager-8dcfb8c5d-mbwfg" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.465687 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fjbj\" (UniqueName: \"kubernetes.io/projected/8a6839a2-c048-442c-a761-c6c1adec39a2-kube-api-access-2fjbj\") pod \"metallb-operator-controller-manager-8dcfb8c5d-mbwfg\" (UID: \"8a6839a2-c048-442c-a761-c6c1adec39a2\") " pod="metallb-system/metallb-operator-controller-manager-8dcfb8c5d-mbwfg" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.540604 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-679f568586-f4xqz"] Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.541490 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-679f568586-f4xqz" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.543493 4891 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.544484 4891 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-qfj5s" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.544744 4891 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.553726 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-679f568586-f4xqz"] Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.601323 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-8dcfb8c5d-mbwfg" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.644036 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6znmr\" (UniqueName: \"kubernetes.io/projected/4ef2ef1c-41d9-4878-9d58-f04f8dc07b2f-kube-api-access-6znmr\") pod \"metallb-operator-webhook-server-679f568586-f4xqz\" (UID: \"4ef2ef1c-41d9-4878-9d58-f04f8dc07b2f\") " pod="metallb-system/metallb-operator-webhook-server-679f568586-f4xqz" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.644316 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4ef2ef1c-41d9-4878-9d58-f04f8dc07b2f-apiservice-cert\") pod \"metallb-operator-webhook-server-679f568586-f4xqz\" (UID: \"4ef2ef1c-41d9-4878-9d58-f04f8dc07b2f\") " pod="metallb-system/metallb-operator-webhook-server-679f568586-f4xqz" Sep 29 10:00:10 crc kubenswrapper[4891]: 
I0929 10:00:10.644465 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4ef2ef1c-41d9-4878-9d58-f04f8dc07b2f-webhook-cert\") pod \"metallb-operator-webhook-server-679f568586-f4xqz\" (UID: \"4ef2ef1c-41d9-4878-9d58-f04f8dc07b2f\") " pod="metallb-system/metallb-operator-webhook-server-679f568586-f4xqz" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.746678 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4ef2ef1c-41d9-4878-9d58-f04f8dc07b2f-webhook-cert\") pod \"metallb-operator-webhook-server-679f568586-f4xqz\" (UID: \"4ef2ef1c-41d9-4878-9d58-f04f8dc07b2f\") " pod="metallb-system/metallb-operator-webhook-server-679f568586-f4xqz" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.747284 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6znmr\" (UniqueName: \"kubernetes.io/projected/4ef2ef1c-41d9-4878-9d58-f04f8dc07b2f-kube-api-access-6znmr\") pod \"metallb-operator-webhook-server-679f568586-f4xqz\" (UID: \"4ef2ef1c-41d9-4878-9d58-f04f8dc07b2f\") " pod="metallb-system/metallb-operator-webhook-server-679f568586-f4xqz" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.747343 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4ef2ef1c-41d9-4878-9d58-f04f8dc07b2f-apiservice-cert\") pod \"metallb-operator-webhook-server-679f568586-f4xqz\" (UID: \"4ef2ef1c-41d9-4878-9d58-f04f8dc07b2f\") " pod="metallb-system/metallb-operator-webhook-server-679f568586-f4xqz" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.759680 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4ef2ef1c-41d9-4878-9d58-f04f8dc07b2f-apiservice-cert\") pod 
\"metallb-operator-webhook-server-679f568586-f4xqz\" (UID: \"4ef2ef1c-41d9-4878-9d58-f04f8dc07b2f\") " pod="metallb-system/metallb-operator-webhook-server-679f568586-f4xqz" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.773772 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4ef2ef1c-41d9-4878-9d58-f04f8dc07b2f-webhook-cert\") pod \"metallb-operator-webhook-server-679f568586-f4xqz\" (UID: \"4ef2ef1c-41d9-4878-9d58-f04f8dc07b2f\") " pod="metallb-system/metallb-operator-webhook-server-679f568586-f4xqz" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.797619 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6znmr\" (UniqueName: \"kubernetes.io/projected/4ef2ef1c-41d9-4878-9d58-f04f8dc07b2f-kube-api-access-6znmr\") pod \"metallb-operator-webhook-server-679f568586-f4xqz\" (UID: \"4ef2ef1c-41d9-4878-9d58-f04f8dc07b2f\") " pod="metallb-system/metallb-operator-webhook-server-679f568586-f4xqz" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.857429 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-679f568586-f4xqz" Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.919071 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-8dcfb8c5d-mbwfg"] Sep 29 10:00:10 crc kubenswrapper[4891]: I0929 10:00:10.985716 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-8dcfb8c5d-mbwfg" event={"ID":"8a6839a2-c048-442c-a761-c6c1adec39a2","Type":"ContainerStarted","Data":"2b1dc95603aec6c1dc062e0dc4879c2faaac12626a45ad2e0b850aac035f187c"} Sep 29 10:00:11 crc kubenswrapper[4891]: I0929 10:00:11.132490 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-679f568586-f4xqz"] Sep 29 10:00:11 crc kubenswrapper[4891]: I0929 10:00:11.992185 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-679f568586-f4xqz" event={"ID":"4ef2ef1c-41d9-4878-9d58-f04f8dc07b2f","Type":"ContainerStarted","Data":"6cce5b6c7cb1cc15754c3ce6d618ed51d88715226849842318c33a397fcb6348"} Sep 29 10:00:16 crc kubenswrapper[4891]: I0929 10:00:16.019529 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-8dcfb8c5d-mbwfg" event={"ID":"8a6839a2-c048-442c-a761-c6c1adec39a2","Type":"ContainerStarted","Data":"f05e01973f671cb5fbccb6d89192eee9bc4c290ceef2da315d0f2819ab5845db"} Sep 29 10:00:16 crc kubenswrapper[4891]: I0929 10:00:16.020013 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-8dcfb8c5d-mbwfg" Sep 29 10:00:16 crc kubenswrapper[4891]: I0929 10:00:16.021986 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-679f568586-f4xqz" 
event={"ID":"4ef2ef1c-41d9-4878-9d58-f04f8dc07b2f","Type":"ContainerStarted","Data":"233b6e576a231d08b45966dcf96a04af536394fe6e6c8a7399a97aa6120c8a98"} Sep 29 10:00:16 crc kubenswrapper[4891]: I0929 10:00:16.022066 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-679f568586-f4xqz" Sep 29 10:00:16 crc kubenswrapper[4891]: I0929 10:00:16.081805 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-679f568586-f4xqz" podStartSLOduration=1.528740019 podStartE2EDuration="6.081774286s" podCreationTimestamp="2025-09-29 10:00:10 +0000 UTC" firstStartedPulling="2025-09-29 10:00:11.142027932 +0000 UTC m=+741.347196253" lastFinishedPulling="2025-09-29 10:00:15.695062199 +0000 UTC m=+745.900230520" observedRunningTime="2025-09-29 10:00:16.081096066 +0000 UTC m=+746.286264387" watchObservedRunningTime="2025-09-29 10:00:16.081774286 +0000 UTC m=+746.286942597" Sep 29 10:00:16 crc kubenswrapper[4891]: I0929 10:00:16.083906 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-8dcfb8c5d-mbwfg" podStartSLOduration=1.4089182199999999 podStartE2EDuration="6.08389971s" podCreationTimestamp="2025-09-29 10:00:10 +0000 UTC" firstStartedPulling="2025-09-29 10:00:10.939265441 +0000 UTC m=+741.144433762" lastFinishedPulling="2025-09-29 10:00:15.614246931 +0000 UTC m=+745.819415252" observedRunningTime="2025-09-29 10:00:16.046960511 +0000 UTC m=+746.252128842" watchObservedRunningTime="2025-09-29 10:00:16.08389971 +0000 UTC m=+746.289068031" Sep 29 10:00:17 crc kubenswrapper[4891]: I0929 10:00:17.549502 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7tk8t"] Sep 29 10:00:17 crc kubenswrapper[4891]: I0929 10:00:17.550217 4891 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" podUID="482b69f0-36a6-4320-8ea5-9e1263400532" containerName="controller-manager" containerID="cri-o://1ab7a15081adac0dff0dd2fcd4fc62257e79d9a3a7589fb96285af9394bb2dee" gracePeriod=30 Sep 29 10:00:17 crc kubenswrapper[4891]: I0929 10:00:17.576516 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg"] Sep 29 10:00:17 crc kubenswrapper[4891]: I0929 10:00:17.576777 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg" podUID="d102c7ee-0242-4b77-85f4-5ca86e742bf2" containerName="route-controller-manager" containerID="cri-o://d88f71a10573a9d27b4741cd4923c7726c0586c252e629c740fe272d2045cc68" gracePeriod=30 Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.029759 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.035365 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.035588 4891 generic.go:334] "Generic (PLEG): container finished" podID="d102c7ee-0242-4b77-85f4-5ca86e742bf2" containerID="d88f71a10573a9d27b4741cd4923c7726c0586c252e629c740fe272d2045cc68" exitCode=0 Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.035648 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg" event={"ID":"d102c7ee-0242-4b77-85f4-5ca86e742bf2","Type":"ContainerDied","Data":"d88f71a10573a9d27b4741cd4923c7726c0586c252e629c740fe272d2045cc68"} Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.035683 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg" event={"ID":"d102c7ee-0242-4b77-85f4-5ca86e742bf2","Type":"ContainerDied","Data":"b11b89e665d67d37bcf8a7d70b3df726f7fad14fb21e472f639ad34a31554d7f"} Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.035702 4891 scope.go:117] "RemoveContainer" containerID="d88f71a10573a9d27b4741cd4923c7726c0586c252e629c740fe272d2045cc68" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.037684 4891 generic.go:334] "Generic (PLEG): container finished" podID="482b69f0-36a6-4320-8ea5-9e1263400532" containerID="1ab7a15081adac0dff0dd2fcd4fc62257e79d9a3a7589fb96285af9394bb2dee" exitCode=0 Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.037710 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" event={"ID":"482b69f0-36a6-4320-8ea5-9e1263400532","Type":"ContainerDied","Data":"1ab7a15081adac0dff0dd2fcd4fc62257e79d9a3a7589fb96285af9394bb2dee"} Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.037726 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" event={"ID":"482b69f0-36a6-4320-8ea5-9e1263400532","Type":"ContainerDied","Data":"b0c9357b891901d8cd69d68d211e2d8362e036de8011ed14abb4cf5d85fdd15d"} Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.037775 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7tk8t" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.051476 4891 scope.go:117] "RemoveContainer" containerID="d88f71a10573a9d27b4741cd4923c7726c0586c252e629c740fe272d2045cc68" Sep 29 10:00:18 crc kubenswrapper[4891]: E0929 10:00:18.052350 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d88f71a10573a9d27b4741cd4923c7726c0586c252e629c740fe272d2045cc68\": container with ID starting with d88f71a10573a9d27b4741cd4923c7726c0586c252e629c740fe272d2045cc68 not found: ID does not exist" containerID="d88f71a10573a9d27b4741cd4923c7726c0586c252e629c740fe272d2045cc68" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.052393 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88f71a10573a9d27b4741cd4923c7726c0586c252e629c740fe272d2045cc68"} err="failed to get container status \"d88f71a10573a9d27b4741cd4923c7726c0586c252e629c740fe272d2045cc68\": rpc error: code = NotFound desc = could not find container \"d88f71a10573a9d27b4741cd4923c7726c0586c252e629c740fe272d2045cc68\": container with ID starting with d88f71a10573a9d27b4741cd4923c7726c0586c252e629c740fe272d2045cc68 not found: ID does not exist" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.052424 4891 scope.go:117] "RemoveContainer" containerID="1ab7a15081adac0dff0dd2fcd4fc62257e79d9a3a7589fb96285af9394bb2dee" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.073138 4891 scope.go:117] "RemoveContainer" 
containerID="1ab7a15081adac0dff0dd2fcd4fc62257e79d9a3a7589fb96285af9394bb2dee" Sep 29 10:00:18 crc kubenswrapper[4891]: E0929 10:00:18.073780 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ab7a15081adac0dff0dd2fcd4fc62257e79d9a3a7589fb96285af9394bb2dee\": container with ID starting with 1ab7a15081adac0dff0dd2fcd4fc62257e79d9a3a7589fb96285af9394bb2dee not found: ID does not exist" containerID="1ab7a15081adac0dff0dd2fcd4fc62257e79d9a3a7589fb96285af9394bb2dee" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.073853 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ab7a15081adac0dff0dd2fcd4fc62257e79d9a3a7589fb96285af9394bb2dee"} err="failed to get container status \"1ab7a15081adac0dff0dd2fcd4fc62257e79d9a3a7589fb96285af9394bb2dee\": rpc error: code = NotFound desc = could not find container \"1ab7a15081adac0dff0dd2fcd4fc62257e79d9a3a7589fb96285af9394bb2dee\": container with ID starting with 1ab7a15081adac0dff0dd2fcd4fc62257e79d9a3a7589fb96285af9394bb2dee not found: ID does not exist" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.153255 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/482b69f0-36a6-4320-8ea5-9e1263400532-serving-cert\") pod \"482b69f0-36a6-4320-8ea5-9e1263400532\" (UID: \"482b69f0-36a6-4320-8ea5-9e1263400532\") " Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.153319 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkc69\" (UniqueName: \"kubernetes.io/projected/482b69f0-36a6-4320-8ea5-9e1263400532-kube-api-access-mkc69\") pod \"482b69f0-36a6-4320-8ea5-9e1263400532\" (UID: \"482b69f0-36a6-4320-8ea5-9e1263400532\") " Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.153362 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/d102c7ee-0242-4b77-85f4-5ca86e742bf2-config\") pod \"d102c7ee-0242-4b77-85f4-5ca86e742bf2\" (UID: \"d102c7ee-0242-4b77-85f4-5ca86e742bf2\") " Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.153388 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/482b69f0-36a6-4320-8ea5-9e1263400532-proxy-ca-bundles\") pod \"482b69f0-36a6-4320-8ea5-9e1263400532\" (UID: \"482b69f0-36a6-4320-8ea5-9e1263400532\") " Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.153438 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/482b69f0-36a6-4320-8ea5-9e1263400532-config\") pod \"482b69f0-36a6-4320-8ea5-9e1263400532\" (UID: \"482b69f0-36a6-4320-8ea5-9e1263400532\") " Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.153464 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d102c7ee-0242-4b77-85f4-5ca86e742bf2-client-ca\") pod \"d102c7ee-0242-4b77-85f4-5ca86e742bf2\" (UID: \"d102c7ee-0242-4b77-85f4-5ca86e742bf2\") " Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.153494 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/482b69f0-36a6-4320-8ea5-9e1263400532-client-ca\") pod \"482b69f0-36a6-4320-8ea5-9e1263400532\" (UID: \"482b69f0-36a6-4320-8ea5-9e1263400532\") " Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.153514 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d102c7ee-0242-4b77-85f4-5ca86e742bf2-serving-cert\") pod \"d102c7ee-0242-4b77-85f4-5ca86e742bf2\" (UID: \"d102c7ee-0242-4b77-85f4-5ca86e742bf2\") " Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.153546 4891 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrnwh\" (UniqueName: \"kubernetes.io/projected/d102c7ee-0242-4b77-85f4-5ca86e742bf2-kube-api-access-mrnwh\") pod \"d102c7ee-0242-4b77-85f4-5ca86e742bf2\" (UID: \"d102c7ee-0242-4b77-85f4-5ca86e742bf2\") " Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.154573 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/482b69f0-36a6-4320-8ea5-9e1263400532-client-ca" (OuterVolumeSpecName: "client-ca") pod "482b69f0-36a6-4320-8ea5-9e1263400532" (UID: "482b69f0-36a6-4320-8ea5-9e1263400532"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.154566 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/482b69f0-36a6-4320-8ea5-9e1263400532-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "482b69f0-36a6-4320-8ea5-9e1263400532" (UID: "482b69f0-36a6-4320-8ea5-9e1263400532"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.154647 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/482b69f0-36a6-4320-8ea5-9e1263400532-config" (OuterVolumeSpecName: "config") pod "482b69f0-36a6-4320-8ea5-9e1263400532" (UID: "482b69f0-36a6-4320-8ea5-9e1263400532"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.155234 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d102c7ee-0242-4b77-85f4-5ca86e742bf2-client-ca" (OuterVolumeSpecName: "client-ca") pod "d102c7ee-0242-4b77-85f4-5ca86e742bf2" (UID: "d102c7ee-0242-4b77-85f4-5ca86e742bf2"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.155253 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d102c7ee-0242-4b77-85f4-5ca86e742bf2-config" (OuterVolumeSpecName: "config") pod "d102c7ee-0242-4b77-85f4-5ca86e742bf2" (UID: "d102c7ee-0242-4b77-85f4-5ca86e742bf2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.161205 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d102c7ee-0242-4b77-85f4-5ca86e742bf2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d102c7ee-0242-4b77-85f4-5ca86e742bf2" (UID: "d102c7ee-0242-4b77-85f4-5ca86e742bf2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.161308 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/482b69f0-36a6-4320-8ea5-9e1263400532-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "482b69f0-36a6-4320-8ea5-9e1263400532" (UID: "482b69f0-36a6-4320-8ea5-9e1263400532"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.161762 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d102c7ee-0242-4b77-85f4-5ca86e742bf2-kube-api-access-mrnwh" (OuterVolumeSpecName: "kube-api-access-mrnwh") pod "d102c7ee-0242-4b77-85f4-5ca86e742bf2" (UID: "d102c7ee-0242-4b77-85f4-5ca86e742bf2"). InnerVolumeSpecName "kube-api-access-mrnwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.163628 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/482b69f0-36a6-4320-8ea5-9e1263400532-kube-api-access-mkc69" (OuterVolumeSpecName: "kube-api-access-mkc69") pod "482b69f0-36a6-4320-8ea5-9e1263400532" (UID: "482b69f0-36a6-4320-8ea5-9e1263400532"). InnerVolumeSpecName "kube-api-access-mkc69". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.254522 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/482b69f0-36a6-4320-8ea5-9e1263400532-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.254560 4891 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d102c7ee-0242-4b77-85f4-5ca86e742bf2-client-ca\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.254568 4891 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/482b69f0-36a6-4320-8ea5-9e1263400532-client-ca\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.254577 4891 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d102c7ee-0242-4b77-85f4-5ca86e742bf2-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.254587 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrnwh\" (UniqueName: \"kubernetes.io/projected/d102c7ee-0242-4b77-85f4-5ca86e742bf2-kube-api-access-mrnwh\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.254599 4891 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/482b69f0-36a6-4320-8ea5-9e1263400532-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.254609 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkc69\" (UniqueName: \"kubernetes.io/projected/482b69f0-36a6-4320-8ea5-9e1263400532-kube-api-access-mkc69\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.254618 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d102c7ee-0242-4b77-85f4-5ca86e742bf2-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.254626 4891 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/482b69f0-36a6-4320-8ea5-9e1263400532-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.372455 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7tk8t"] Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.381330 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7tk8t"] Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.403469 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="482b69f0-36a6-4320-8ea5-9e1263400532" path="/var/lib/kubelet/pods/482b69f0-36a6-4320-8ea5-9e1263400532/volumes" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.986747 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5758b6c4b6-7krvm"] Sep 29 10:00:18 crc kubenswrapper[4891]: E0929 10:00:18.987648 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d102c7ee-0242-4b77-85f4-5ca86e742bf2" containerName="route-controller-manager" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 
10:00:18.987666 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="d102c7ee-0242-4b77-85f4-5ca86e742bf2" containerName="route-controller-manager" Sep 29 10:00:18 crc kubenswrapper[4891]: E0929 10:00:18.987696 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="482b69f0-36a6-4320-8ea5-9e1263400532" containerName="controller-manager" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.987705 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="482b69f0-36a6-4320-8ea5-9e1263400532" containerName="controller-manager" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.987846 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="d102c7ee-0242-4b77-85f4-5ca86e742bf2" containerName="route-controller-manager" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.987864 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="482b69f0-36a6-4320-8ea5-9e1263400532" containerName="controller-manager" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.988550 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5758b6c4b6-7krvm" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.991532 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.991997 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.992237 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.992362 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 29 10:00:18 crc kubenswrapper[4891]: I0929 10:00:18.992481 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.004685 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.009891 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.017952 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66456ff4cc-vq8q2"] Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.019981 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66456ff4cc-vq8q2" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.026700 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5758b6c4b6-7krvm"] Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.045945 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66456ff4cc-vq8q2"] Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.062058 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.064524 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1948e7e-f371-4052-866e-932f886f538d-config\") pod \"controller-manager-5758b6c4b6-7krvm\" (UID: \"c1948e7e-f371-4052-866e-932f886f538d\") " pod="openshift-controller-manager/controller-manager-5758b6c4b6-7krvm" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.064587 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/377273bc-da6c-4fea-9055-9d4676c349d7-serving-cert\") pod \"route-controller-manager-66456ff4cc-vq8q2\" (UID: \"377273bc-da6c-4fea-9055-9d4676c349d7\") " pod="openshift-route-controller-manager/route-controller-manager-66456ff4cc-vq8q2" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.064627 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/377273bc-da6c-4fea-9055-9d4676c349d7-config\") pod \"route-controller-manager-66456ff4cc-vq8q2\" (UID: \"377273bc-da6c-4fea-9055-9d4676c349d7\") " 
pod="openshift-route-controller-manager/route-controller-manager-66456ff4cc-vq8q2" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.064651 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1948e7e-f371-4052-866e-932f886f538d-client-ca\") pod \"controller-manager-5758b6c4b6-7krvm\" (UID: \"c1948e7e-f371-4052-866e-932f886f538d\") " pod="openshift-controller-manager/controller-manager-5758b6c4b6-7krvm" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.064824 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/377273bc-da6c-4fea-9055-9d4676c349d7-client-ca\") pod \"route-controller-manager-66456ff4cc-vq8q2\" (UID: \"377273bc-da6c-4fea-9055-9d4676c349d7\") " pod="openshift-route-controller-manager/route-controller-manager-66456ff4cc-vq8q2" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.064986 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwmqs\" (UniqueName: \"kubernetes.io/projected/c1948e7e-f371-4052-866e-932f886f538d-kube-api-access-vwmqs\") pod \"controller-manager-5758b6c4b6-7krvm\" (UID: \"c1948e7e-f371-4052-866e-932f886f538d\") " pod="openshift-controller-manager/controller-manager-5758b6c4b6-7krvm" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.065101 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wt55\" (UniqueName: \"kubernetes.io/projected/377273bc-da6c-4fea-9055-9d4676c349d7-kube-api-access-6wt55\") pod \"route-controller-manager-66456ff4cc-vq8q2\" (UID: \"377273bc-da6c-4fea-9055-9d4676c349d7\") " pod="openshift-route-controller-manager/route-controller-manager-66456ff4cc-vq8q2" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.065263 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1948e7e-f371-4052-866e-932f886f538d-serving-cert\") pod \"controller-manager-5758b6c4b6-7krvm\" (UID: \"c1948e7e-f371-4052-866e-932f886f538d\") " pod="openshift-controller-manager/controller-manager-5758b6c4b6-7krvm" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.065304 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c1948e7e-f371-4052-866e-932f886f538d-proxy-ca-bundles\") pod \"controller-manager-5758b6c4b6-7krvm\" (UID: \"c1948e7e-f371-4052-866e-932f886f538d\") " pod="openshift-controller-manager/controller-manager-5758b6c4b6-7krvm" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.080187 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg"] Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.083312 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bzpg"] Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.166577 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/377273bc-da6c-4fea-9055-9d4676c349d7-serving-cert\") pod \"route-controller-manager-66456ff4cc-vq8q2\" (UID: \"377273bc-da6c-4fea-9055-9d4676c349d7\") " pod="openshift-route-controller-manager/route-controller-manager-66456ff4cc-vq8q2" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.166641 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/377273bc-da6c-4fea-9055-9d4676c349d7-config\") pod \"route-controller-manager-66456ff4cc-vq8q2\" (UID: \"377273bc-da6c-4fea-9055-9d4676c349d7\") " 
pod="openshift-route-controller-manager/route-controller-manager-66456ff4cc-vq8q2" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.166674 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1948e7e-f371-4052-866e-932f886f538d-client-ca\") pod \"controller-manager-5758b6c4b6-7krvm\" (UID: \"c1948e7e-f371-4052-866e-932f886f538d\") " pod="openshift-controller-manager/controller-manager-5758b6c4b6-7krvm" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.166693 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/377273bc-da6c-4fea-9055-9d4676c349d7-client-ca\") pod \"route-controller-manager-66456ff4cc-vq8q2\" (UID: \"377273bc-da6c-4fea-9055-9d4676c349d7\") " pod="openshift-route-controller-manager/route-controller-manager-66456ff4cc-vq8q2" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.166712 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwmqs\" (UniqueName: \"kubernetes.io/projected/c1948e7e-f371-4052-866e-932f886f538d-kube-api-access-vwmqs\") pod \"controller-manager-5758b6c4b6-7krvm\" (UID: \"c1948e7e-f371-4052-866e-932f886f538d\") " pod="openshift-controller-manager/controller-manager-5758b6c4b6-7krvm" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.166734 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wt55\" (UniqueName: \"kubernetes.io/projected/377273bc-da6c-4fea-9055-9d4676c349d7-kube-api-access-6wt55\") pod \"route-controller-manager-66456ff4cc-vq8q2\" (UID: \"377273bc-da6c-4fea-9055-9d4676c349d7\") " pod="openshift-route-controller-manager/route-controller-manager-66456ff4cc-vq8q2" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.166768 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c1948e7e-f371-4052-866e-932f886f538d-serving-cert\") pod \"controller-manager-5758b6c4b6-7krvm\" (UID: \"c1948e7e-f371-4052-866e-932f886f538d\") " pod="openshift-controller-manager/controller-manager-5758b6c4b6-7krvm" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.166801 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c1948e7e-f371-4052-866e-932f886f538d-proxy-ca-bundles\") pod \"controller-manager-5758b6c4b6-7krvm\" (UID: \"c1948e7e-f371-4052-866e-932f886f538d\") " pod="openshift-controller-manager/controller-manager-5758b6c4b6-7krvm" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.166830 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1948e7e-f371-4052-866e-932f886f538d-config\") pod \"controller-manager-5758b6c4b6-7krvm\" (UID: \"c1948e7e-f371-4052-866e-932f886f538d\") " pod="openshift-controller-manager/controller-manager-5758b6c4b6-7krvm" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.168091 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1948e7e-f371-4052-866e-932f886f538d-config\") pod \"controller-manager-5758b6c4b6-7krvm\" (UID: \"c1948e7e-f371-4052-866e-932f886f538d\") " pod="openshift-controller-manager/controller-manager-5758b6c4b6-7krvm" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.169995 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/377273bc-da6c-4fea-9055-9d4676c349d7-config\") pod \"route-controller-manager-66456ff4cc-vq8q2\" (UID: \"377273bc-da6c-4fea-9055-9d4676c349d7\") " pod="openshift-route-controller-manager/route-controller-manager-66456ff4cc-vq8q2" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.170600 4891 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1948e7e-f371-4052-866e-932f886f538d-client-ca\") pod \"controller-manager-5758b6c4b6-7krvm\" (UID: \"c1948e7e-f371-4052-866e-932f886f538d\") " pod="openshift-controller-manager/controller-manager-5758b6c4b6-7krvm" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.171487 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/377273bc-da6c-4fea-9055-9d4676c349d7-serving-cert\") pod \"route-controller-manager-66456ff4cc-vq8q2\" (UID: \"377273bc-da6c-4fea-9055-9d4676c349d7\") " pod="openshift-route-controller-manager/route-controller-manager-66456ff4cc-vq8q2" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.171611 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1948e7e-f371-4052-866e-932f886f538d-serving-cert\") pod \"controller-manager-5758b6c4b6-7krvm\" (UID: \"c1948e7e-f371-4052-866e-932f886f538d\") " pod="openshift-controller-manager/controller-manager-5758b6c4b6-7krvm" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.173358 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c1948e7e-f371-4052-866e-932f886f538d-proxy-ca-bundles\") pod \"controller-manager-5758b6c4b6-7krvm\" (UID: \"c1948e7e-f371-4052-866e-932f886f538d\") " pod="openshift-controller-manager/controller-manager-5758b6c4b6-7krvm" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.173956 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/377273bc-da6c-4fea-9055-9d4676c349d7-client-ca\") pod \"route-controller-manager-66456ff4cc-vq8q2\" (UID: \"377273bc-da6c-4fea-9055-9d4676c349d7\") " pod="openshift-route-controller-manager/route-controller-manager-66456ff4cc-vq8q2" Sep 29 10:00:19 crc 
kubenswrapper[4891]: I0929 10:00:19.186300 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wt55\" (UniqueName: \"kubernetes.io/projected/377273bc-da6c-4fea-9055-9d4676c349d7-kube-api-access-6wt55\") pod \"route-controller-manager-66456ff4cc-vq8q2\" (UID: \"377273bc-da6c-4fea-9055-9d4676c349d7\") " pod="openshift-route-controller-manager/route-controller-manager-66456ff4cc-vq8q2" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.186738 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwmqs\" (UniqueName: \"kubernetes.io/projected/c1948e7e-f371-4052-866e-932f886f538d-kube-api-access-vwmqs\") pod \"controller-manager-5758b6c4b6-7krvm\" (UID: \"c1948e7e-f371-4052-866e-932f886f538d\") " pod="openshift-controller-manager/controller-manager-5758b6c4b6-7krvm" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.341636 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5758b6c4b6-7krvm" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.364304 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66456ff4cc-vq8q2" Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.568957 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5758b6c4b6-7krvm"] Sep 29 10:00:19 crc kubenswrapper[4891]: I0929 10:00:19.828728 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66456ff4cc-vq8q2"] Sep 29 10:00:20 crc kubenswrapper[4891]: I0929 10:00:20.074939 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5758b6c4b6-7krvm" event={"ID":"c1948e7e-f371-4052-866e-932f886f538d","Type":"ContainerStarted","Data":"03ff2f373f312c60da6e329f6a77f72bc7e8619eae5106142525311d2185b926"} Sep 29 10:00:20 crc kubenswrapper[4891]: I0929 10:00:20.074987 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5758b6c4b6-7krvm" event={"ID":"c1948e7e-f371-4052-866e-932f886f538d","Type":"ContainerStarted","Data":"52929318480de6df5806df52d34774db0801d7c0c263f28cedae334667d86f85"} Sep 29 10:00:20 crc kubenswrapper[4891]: I0929 10:00:20.076112 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5758b6c4b6-7krvm" Sep 29 10:00:20 crc kubenswrapper[4891]: I0929 10:00:20.077977 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66456ff4cc-vq8q2" event={"ID":"377273bc-da6c-4fea-9055-9d4676c349d7","Type":"ContainerStarted","Data":"1431ad524e4566f2e71c9606f960a5b57af4d2f3e6954319b38f79880f162aba"} Sep 29 10:00:20 crc kubenswrapper[4891]: I0929 10:00:20.078021 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66456ff4cc-vq8q2" 
event={"ID":"377273bc-da6c-4fea-9055-9d4676c349d7","Type":"ContainerStarted","Data":"32d38d98a7d428f01df3ef3ffd12c104672a75a23ee75e7831cf33b5b27334e3"} Sep 29 10:00:20 crc kubenswrapper[4891]: I0929 10:00:20.078586 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-66456ff4cc-vq8q2" Sep 29 10:00:20 crc kubenswrapper[4891]: I0929 10:00:20.079553 4891 patch_prober.go:28] interesting pod/route-controller-manager-66456ff4cc-vq8q2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Sep 29 10:00:20 crc kubenswrapper[4891]: I0929 10:00:20.079601 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-66456ff4cc-vq8q2" podUID="377273bc-da6c-4fea-9055-9d4676c349d7" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Sep 29 10:00:20 crc kubenswrapper[4891]: I0929 10:00:20.095899 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5758b6c4b6-7krvm" podStartSLOduration=3.095878273 podStartE2EDuration="3.095878273s" podCreationTimestamp="2025-09-29 10:00:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:00:20.093484011 +0000 UTC m=+750.298652352" watchObservedRunningTime="2025-09-29 10:00:20.095878273 +0000 UTC m=+750.301046614" Sep 29 10:00:20 crc kubenswrapper[4891]: I0929 10:00:20.111613 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5758b6c4b6-7krvm" Sep 29 10:00:20 crc 
kubenswrapper[4891]: I0929 10:00:20.118640 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-66456ff4cc-vq8q2" podStartSLOduration=3.118617516 podStartE2EDuration="3.118617516s" podCreationTimestamp="2025-09-29 10:00:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:00:20.117071789 +0000 UTC m=+750.322240120" watchObservedRunningTime="2025-09-29 10:00:20.118617516 +0000 UTC m=+750.323785847" Sep 29 10:00:20 crc kubenswrapper[4891]: I0929 10:00:20.404780 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d102c7ee-0242-4b77-85f4-5ca86e742bf2" path="/var/lib/kubelet/pods/d102c7ee-0242-4b77-85f4-5ca86e742bf2/volumes" Sep 29 10:00:21 crc kubenswrapper[4891]: I0929 10:00:21.089864 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-66456ff4cc-vq8q2" Sep 29 10:00:26 crc kubenswrapper[4891]: I0929 10:00:26.500734 4891 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 29 10:00:30 crc kubenswrapper[4891]: I0929 10:00:30.864869 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-679f568586-f4xqz" Sep 29 10:00:36 crc kubenswrapper[4891]: I0929 10:00:36.186681 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:00:36 crc kubenswrapper[4891]: I0929 10:00:36.187296 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" 
podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:00:49 crc kubenswrapper[4891]: I0929 10:00:49.805389 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xwrmn"] Sep 29 10:00:49 crc kubenswrapper[4891]: I0929 10:00:49.810168 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xwrmn" Sep 29 10:00:49 crc kubenswrapper[4891]: I0929 10:00:49.837065 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xwrmn"] Sep 29 10:00:49 crc kubenswrapper[4891]: I0929 10:00:49.950055 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xp4f\" (UniqueName: \"kubernetes.io/projected/6a2d3a84-80f0-49f5-966c-034ab38dcb98-kube-api-access-8xp4f\") pod \"redhat-operators-xwrmn\" (UID: \"6a2d3a84-80f0-49f5-966c-034ab38dcb98\") " pod="openshift-marketplace/redhat-operators-xwrmn" Sep 29 10:00:49 crc kubenswrapper[4891]: I0929 10:00:49.950147 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a2d3a84-80f0-49f5-966c-034ab38dcb98-catalog-content\") pod \"redhat-operators-xwrmn\" (UID: \"6a2d3a84-80f0-49f5-966c-034ab38dcb98\") " pod="openshift-marketplace/redhat-operators-xwrmn" Sep 29 10:00:49 crc kubenswrapper[4891]: I0929 10:00:49.950168 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a2d3a84-80f0-49f5-966c-034ab38dcb98-utilities\") pod \"redhat-operators-xwrmn\" (UID: \"6a2d3a84-80f0-49f5-966c-034ab38dcb98\") " pod="openshift-marketplace/redhat-operators-xwrmn" Sep 29 10:00:50 crc 
kubenswrapper[4891]: I0929 10:00:50.051844 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a2d3a84-80f0-49f5-966c-034ab38dcb98-catalog-content\") pod \"redhat-operators-xwrmn\" (UID: \"6a2d3a84-80f0-49f5-966c-034ab38dcb98\") " pod="openshift-marketplace/redhat-operators-xwrmn" Sep 29 10:00:50 crc kubenswrapper[4891]: I0929 10:00:50.051904 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a2d3a84-80f0-49f5-966c-034ab38dcb98-utilities\") pod \"redhat-operators-xwrmn\" (UID: \"6a2d3a84-80f0-49f5-966c-034ab38dcb98\") " pod="openshift-marketplace/redhat-operators-xwrmn" Sep 29 10:00:50 crc kubenswrapper[4891]: I0929 10:00:50.051985 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xp4f\" (UniqueName: \"kubernetes.io/projected/6a2d3a84-80f0-49f5-966c-034ab38dcb98-kube-api-access-8xp4f\") pod \"redhat-operators-xwrmn\" (UID: \"6a2d3a84-80f0-49f5-966c-034ab38dcb98\") " pod="openshift-marketplace/redhat-operators-xwrmn" Sep 29 10:00:50 crc kubenswrapper[4891]: I0929 10:00:50.052819 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a2d3a84-80f0-49f5-966c-034ab38dcb98-catalog-content\") pod \"redhat-operators-xwrmn\" (UID: \"6a2d3a84-80f0-49f5-966c-034ab38dcb98\") " pod="openshift-marketplace/redhat-operators-xwrmn" Sep 29 10:00:50 crc kubenswrapper[4891]: I0929 10:00:50.052882 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a2d3a84-80f0-49f5-966c-034ab38dcb98-utilities\") pod \"redhat-operators-xwrmn\" (UID: \"6a2d3a84-80f0-49f5-966c-034ab38dcb98\") " pod="openshift-marketplace/redhat-operators-xwrmn" Sep 29 10:00:50 crc kubenswrapper[4891]: I0929 10:00:50.074024 4891 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xp4f\" (UniqueName: \"kubernetes.io/projected/6a2d3a84-80f0-49f5-966c-034ab38dcb98-kube-api-access-8xp4f\") pod \"redhat-operators-xwrmn\" (UID: \"6a2d3a84-80f0-49f5-966c-034ab38dcb98\") " pod="openshift-marketplace/redhat-operators-xwrmn" Sep 29 10:00:50 crc kubenswrapper[4891]: I0929 10:00:50.150869 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xwrmn" Sep 29 10:00:50 crc kubenswrapper[4891]: I0929 10:00:50.605048 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-8dcfb8c5d-mbwfg" Sep 29 10:00:50 crc kubenswrapper[4891]: I0929 10:00:50.613194 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xwrmn"] Sep 29 10:00:50 crc kubenswrapper[4891]: W0929 10:00:50.625041 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a2d3a84_80f0_49f5_966c_034ab38dcb98.slice/crio-adc8f477c23090b669cbf854fe396c2d0af8d3de3eeef1a554a1ef1750e27800 WatchSource:0}: Error finding container adc8f477c23090b669cbf854fe396c2d0af8d3de3eeef1a554a1ef1750e27800: Status 404 returned error can't find the container with id adc8f477c23090b669cbf854fe396c2d0af8d3de3eeef1a554a1ef1750e27800 Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.287030 4891 generic.go:334] "Generic (PLEG): container finished" podID="6a2d3a84-80f0-49f5-966c-034ab38dcb98" containerID="c72cbd92b0776be685a1404b3812a7f90fd92da7306e46f2cf1dc87243c55c8f" exitCode=0 Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.287097 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwrmn" 
event={"ID":"6a2d3a84-80f0-49f5-966c-034ab38dcb98","Type":"ContainerDied","Data":"c72cbd92b0776be685a1404b3812a7f90fd92da7306e46f2cf1dc87243c55c8f"} Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.287136 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwrmn" event={"ID":"6a2d3a84-80f0-49f5-966c-034ab38dcb98","Type":"ContainerStarted","Data":"adc8f477c23090b669cbf854fe396c2d0af8d3de3eeef1a554a1ef1750e27800"} Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.350182 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-8hs8n"] Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.353406 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.356039 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.356620 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-bb72s"] Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.356675 4891 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.357642 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bb72s" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.358015 4891 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-xpv5n" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.359566 4891 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.379948 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-bb72s"] Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.453936 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-wlhcc"] Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.454992 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-wlhcc" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.457599 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.457987 4891 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-ds6l9" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.458113 4891 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.458253 4891 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.469912 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e14026d1-17ce-4f77-b28a-274da74a5c15-metrics-certs\") pod \"frr-k8s-8hs8n\" (UID: \"e14026d1-17ce-4f77-b28a-274da74a5c15\") " 
pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.469982 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e14026d1-17ce-4f77-b28a-274da74a5c15-frr-conf\") pod \"frr-k8s-8hs8n\" (UID: \"e14026d1-17ce-4f77-b28a-274da74a5c15\") " pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.470015 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e14026d1-17ce-4f77-b28a-274da74a5c15-frr-startup\") pod \"frr-k8s-8hs8n\" (UID: \"e14026d1-17ce-4f77-b28a-274da74a5c15\") " pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.470031 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cblpm\" (UniqueName: \"kubernetes.io/projected/88e3267b-49e6-443d-8cc6-285a983b44ec-kube-api-access-cblpm\") pod \"frr-k8s-webhook-server-5478bdb765-bb72s\" (UID: \"88e3267b-49e6-443d-8cc6-285a983b44ec\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bb72s" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.470051 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e14026d1-17ce-4f77-b28a-274da74a5c15-frr-sockets\") pod \"frr-k8s-8hs8n\" (UID: \"e14026d1-17ce-4f77-b28a-274da74a5c15\") " pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.470067 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e14026d1-17ce-4f77-b28a-274da74a5c15-metrics\") pod \"frr-k8s-8hs8n\" (UID: \"e14026d1-17ce-4f77-b28a-274da74a5c15\") " pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:00:51 crc 
kubenswrapper[4891]: I0929 10:00:51.470096 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e14026d1-17ce-4f77-b28a-274da74a5c15-reloader\") pod \"frr-k8s-8hs8n\" (UID: \"e14026d1-17ce-4f77-b28a-274da74a5c15\") " pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.470114 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khbd4\" (UniqueName: \"kubernetes.io/projected/e14026d1-17ce-4f77-b28a-274da74a5c15-kube-api-access-khbd4\") pod \"frr-k8s-8hs8n\" (UID: \"e14026d1-17ce-4f77-b28a-274da74a5c15\") " pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.470170 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88e3267b-49e6-443d-8cc6-285a983b44ec-cert\") pod \"frr-k8s-webhook-server-5478bdb765-bb72s\" (UID: \"88e3267b-49e6-443d-8cc6-285a983b44ec\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bb72s" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.479309 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-2q7w2"] Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.480541 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-2q7w2" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.483809 4891 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.492584 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-2q7w2"] Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.572180 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/eec47c49-2fdd-4eba-aca2-438041840948-memberlist\") pod \"speaker-wlhcc\" (UID: \"eec47c49-2fdd-4eba-aca2-438041840948\") " pod="metallb-system/speaker-wlhcc" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.572824 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88e3267b-49e6-443d-8cc6-285a983b44ec-cert\") pod \"frr-k8s-webhook-server-5478bdb765-bb72s\" (UID: \"88e3267b-49e6-443d-8cc6-285a983b44ec\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bb72s" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.572861 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e14026d1-17ce-4f77-b28a-274da74a5c15-metrics-certs\") pod \"frr-k8s-8hs8n\" (UID: \"e14026d1-17ce-4f77-b28a-274da74a5c15\") " pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.572887 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e14026d1-17ce-4f77-b28a-274da74a5c15-frr-conf\") pod \"frr-k8s-8hs8n\" (UID: \"e14026d1-17ce-4f77-b28a-274da74a5c15\") " pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.572919 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/105a82c3-b488-41fb-a511-69b3c239dbd2-cert\") pod \"controller-5d688f5ffc-2q7w2\" (UID: \"105a82c3-b488-41fb-a511-69b3c239dbd2\") " pod="metallb-system/controller-5d688f5ffc-2q7w2" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.572953 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzc68\" (UniqueName: \"kubernetes.io/projected/eec47c49-2fdd-4eba-aca2-438041840948-kube-api-access-mzc68\") pod \"speaker-wlhcc\" (UID: \"eec47c49-2fdd-4eba-aca2-438041840948\") " pod="metallb-system/speaker-wlhcc" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.573014 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/105a82c3-b488-41fb-a511-69b3c239dbd2-metrics-certs\") pod \"controller-5d688f5ffc-2q7w2\" (UID: \"105a82c3-b488-41fb-a511-69b3c239dbd2\") " pod="metallb-system/controller-5d688f5ffc-2q7w2" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.573122 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e14026d1-17ce-4f77-b28a-274da74a5c15-frr-startup\") pod \"frr-k8s-8hs8n\" (UID: \"e14026d1-17ce-4f77-b28a-274da74a5c15\") " pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:00:51 crc kubenswrapper[4891]: E0929 10:00:51.573232 4891 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.573453 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e14026d1-17ce-4f77-b28a-274da74a5c15-frr-conf\") pod \"frr-k8s-8hs8n\" (UID: \"e14026d1-17ce-4f77-b28a-274da74a5c15\") " pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:00:51 crc 
kubenswrapper[4891]: I0929 10:00:51.573518 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cblpm\" (UniqueName: \"kubernetes.io/projected/88e3267b-49e6-443d-8cc6-285a983b44ec-kube-api-access-cblpm\") pod \"frr-k8s-webhook-server-5478bdb765-bb72s\" (UID: \"88e3267b-49e6-443d-8cc6-285a983b44ec\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bb72s" Sep 29 10:00:51 crc kubenswrapper[4891]: E0929 10:00:51.573591 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e14026d1-17ce-4f77-b28a-274da74a5c15-metrics-certs podName:e14026d1-17ce-4f77-b28a-274da74a5c15 nodeName:}" failed. No retries permitted until 2025-09-29 10:00:52.073541324 +0000 UTC m=+782.278709635 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e14026d1-17ce-4f77-b28a-274da74a5c15-metrics-certs") pod "frr-k8s-8hs8n" (UID: "e14026d1-17ce-4f77-b28a-274da74a5c15") : secret "frr-k8s-certs-secret" not found Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.573833 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e14026d1-17ce-4f77-b28a-274da74a5c15-frr-sockets\") pod \"frr-k8s-8hs8n\" (UID: \"e14026d1-17ce-4f77-b28a-274da74a5c15\") " pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.573878 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e14026d1-17ce-4f77-b28a-274da74a5c15-metrics\") pod \"frr-k8s-8hs8n\" (UID: \"e14026d1-17ce-4f77-b28a-274da74a5c15\") " pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.573921 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/eec47c49-2fdd-4eba-aca2-438041840948-metallb-excludel2\") pod \"speaker-wlhcc\" (UID: \"eec47c49-2fdd-4eba-aca2-438041840948\") " pod="metallb-system/speaker-wlhcc" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.573961 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eec47c49-2fdd-4eba-aca2-438041840948-metrics-certs\") pod \"speaker-wlhcc\" (UID: \"eec47c49-2fdd-4eba-aca2-438041840948\") " pod="metallb-system/speaker-wlhcc" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.574006 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdfnw\" (UniqueName: \"kubernetes.io/projected/105a82c3-b488-41fb-a511-69b3c239dbd2-kube-api-access-cdfnw\") pod \"controller-5d688f5ffc-2q7w2\" (UID: \"105a82c3-b488-41fb-a511-69b3c239dbd2\") " pod="metallb-system/controller-5d688f5ffc-2q7w2" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.574030 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e14026d1-17ce-4f77-b28a-274da74a5c15-reloader\") pod \"frr-k8s-8hs8n\" (UID: \"e14026d1-17ce-4f77-b28a-274da74a5c15\") " pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.574071 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khbd4\" (UniqueName: \"kubernetes.io/projected/e14026d1-17ce-4f77-b28a-274da74a5c15-kube-api-access-khbd4\") pod \"frr-k8s-8hs8n\" (UID: \"e14026d1-17ce-4f77-b28a-274da74a5c15\") " pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.574108 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e14026d1-17ce-4f77-b28a-274da74a5c15-frr-sockets\") pod \"frr-k8s-8hs8n\" (UID: 
\"e14026d1-17ce-4f77-b28a-274da74a5c15\") " pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.574225 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e14026d1-17ce-4f77-b28a-274da74a5c15-frr-startup\") pod \"frr-k8s-8hs8n\" (UID: \"e14026d1-17ce-4f77-b28a-274da74a5c15\") " pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.574453 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e14026d1-17ce-4f77-b28a-274da74a5c15-reloader\") pod \"frr-k8s-8hs8n\" (UID: \"e14026d1-17ce-4f77-b28a-274da74a5c15\") " pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.574466 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e14026d1-17ce-4f77-b28a-274da74a5c15-metrics\") pod \"frr-k8s-8hs8n\" (UID: \"e14026d1-17ce-4f77-b28a-274da74a5c15\") " pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.594967 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khbd4\" (UniqueName: \"kubernetes.io/projected/e14026d1-17ce-4f77-b28a-274da74a5c15-kube-api-access-khbd4\") pod \"frr-k8s-8hs8n\" (UID: \"e14026d1-17ce-4f77-b28a-274da74a5c15\") " pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.598062 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cblpm\" (UniqueName: \"kubernetes.io/projected/88e3267b-49e6-443d-8cc6-285a983b44ec-kube-api-access-cblpm\") pod \"frr-k8s-webhook-server-5478bdb765-bb72s\" (UID: \"88e3267b-49e6-443d-8cc6-285a983b44ec\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bb72s" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.603039 4891 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88e3267b-49e6-443d-8cc6-285a983b44ec-cert\") pod \"frr-k8s-webhook-server-5478bdb765-bb72s\" (UID: \"88e3267b-49e6-443d-8cc6-285a983b44ec\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bb72s" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.675726 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/eec47c49-2fdd-4eba-aca2-438041840948-memberlist\") pod \"speaker-wlhcc\" (UID: \"eec47c49-2fdd-4eba-aca2-438041840948\") " pod="metallb-system/speaker-wlhcc" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.675890 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/105a82c3-b488-41fb-a511-69b3c239dbd2-cert\") pod \"controller-5d688f5ffc-2q7w2\" (UID: \"105a82c3-b488-41fb-a511-69b3c239dbd2\") " pod="metallb-system/controller-5d688f5ffc-2q7w2" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.675927 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzc68\" (UniqueName: \"kubernetes.io/projected/eec47c49-2fdd-4eba-aca2-438041840948-kube-api-access-mzc68\") pod \"speaker-wlhcc\" (UID: \"eec47c49-2fdd-4eba-aca2-438041840948\") " pod="metallb-system/speaker-wlhcc" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.675964 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/105a82c3-b488-41fb-a511-69b3c239dbd2-metrics-certs\") pod \"controller-5d688f5ffc-2q7w2\" (UID: \"105a82c3-b488-41fb-a511-69b3c239dbd2\") " pod="metallb-system/controller-5d688f5ffc-2q7w2" Sep 29 10:00:51 crc kubenswrapper[4891]: E0929 10:00:51.675965 4891 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 29 10:00:51 crc 
kubenswrapper[4891]: I0929 10:00:51.676006 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/eec47c49-2fdd-4eba-aca2-438041840948-metallb-excludel2\") pod \"speaker-wlhcc\" (UID: \"eec47c49-2fdd-4eba-aca2-438041840948\") " pod="metallb-system/speaker-wlhcc" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.676037 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eec47c49-2fdd-4eba-aca2-438041840948-metrics-certs\") pod \"speaker-wlhcc\" (UID: \"eec47c49-2fdd-4eba-aca2-438041840948\") " pod="metallb-system/speaker-wlhcc" Sep 29 10:00:51 crc kubenswrapper[4891]: E0929 10:00:51.676062 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eec47c49-2fdd-4eba-aca2-438041840948-memberlist podName:eec47c49-2fdd-4eba-aca2-438041840948 nodeName:}" failed. No retries permitted until 2025-09-29 10:00:52.176039115 +0000 UTC m=+782.381207436 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/eec47c49-2fdd-4eba-aca2-438041840948-memberlist") pod "speaker-wlhcc" (UID: "eec47c49-2fdd-4eba-aca2-438041840948") : secret "metallb-memberlist" not found Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.676096 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdfnw\" (UniqueName: \"kubernetes.io/projected/105a82c3-b488-41fb-a511-69b3c239dbd2-kube-api-access-cdfnw\") pod \"controller-5d688f5ffc-2q7w2\" (UID: \"105a82c3-b488-41fb-a511-69b3c239dbd2\") " pod="metallb-system/controller-5d688f5ffc-2q7w2" Sep 29 10:00:51 crc kubenswrapper[4891]: E0929 10:00:51.676187 4891 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Sep 29 10:00:51 crc kubenswrapper[4891]: E0929 10:00:51.676269 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eec47c49-2fdd-4eba-aca2-438041840948-metrics-certs podName:eec47c49-2fdd-4eba-aca2-438041840948 nodeName:}" failed. No retries permitted until 2025-09-29 10:00:52.176242221 +0000 UTC m=+782.381410722 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eec47c49-2fdd-4eba-aca2-438041840948-metrics-certs") pod "speaker-wlhcc" (UID: "eec47c49-2fdd-4eba-aca2-438041840948") : secret "speaker-certs-secret" not found Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.677222 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/eec47c49-2fdd-4eba-aca2-438041840948-metallb-excludel2\") pod \"speaker-wlhcc\" (UID: \"eec47c49-2fdd-4eba-aca2-438041840948\") " pod="metallb-system/speaker-wlhcc" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.679565 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/105a82c3-b488-41fb-a511-69b3c239dbd2-metrics-certs\") pod \"controller-5d688f5ffc-2q7w2\" (UID: \"105a82c3-b488-41fb-a511-69b3c239dbd2\") " pod="metallb-system/controller-5d688f5ffc-2q7w2" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.679735 4891 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.682217 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bb72s" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.689530 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/105a82c3-b488-41fb-a511-69b3c239dbd2-cert\") pod \"controller-5d688f5ffc-2q7w2\" (UID: \"105a82c3-b488-41fb-a511-69b3c239dbd2\") " pod="metallb-system/controller-5d688f5ffc-2q7w2" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.702764 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdfnw\" (UniqueName: \"kubernetes.io/projected/105a82c3-b488-41fb-a511-69b3c239dbd2-kube-api-access-cdfnw\") pod \"controller-5d688f5ffc-2q7w2\" (UID: \"105a82c3-b488-41fb-a511-69b3c239dbd2\") " pod="metallb-system/controller-5d688f5ffc-2q7w2" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.704903 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzc68\" (UniqueName: \"kubernetes.io/projected/eec47c49-2fdd-4eba-aca2-438041840948-kube-api-access-mzc68\") pod \"speaker-wlhcc\" (UID: \"eec47c49-2fdd-4eba-aca2-438041840948\") " pod="metallb-system/speaker-wlhcc" Sep 29 10:00:51 crc kubenswrapper[4891]: I0929 10:00:51.799383 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-2q7w2" Sep 29 10:00:52 crc kubenswrapper[4891]: I0929 10:00:52.085556 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e14026d1-17ce-4f77-b28a-274da74a5c15-metrics-certs\") pod \"frr-k8s-8hs8n\" (UID: \"e14026d1-17ce-4f77-b28a-274da74a5c15\") " pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:00:52 crc kubenswrapper[4891]: I0929 10:00:52.092608 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e14026d1-17ce-4f77-b28a-274da74a5c15-metrics-certs\") pod \"frr-k8s-8hs8n\" (UID: \"e14026d1-17ce-4f77-b28a-274da74a5c15\") " pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:00:52 crc kubenswrapper[4891]: I0929 10:00:52.118822 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-bb72s"] Sep 29 10:00:52 crc kubenswrapper[4891]: W0929 10:00:52.120753 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88e3267b_49e6_443d_8cc6_285a983b44ec.slice/crio-512c5e5fd319698dcd47a5e29fa32b38c1138afd1cf0fa8a975aa517ada9b26c WatchSource:0}: Error finding container 512c5e5fd319698dcd47a5e29fa32b38c1138afd1cf0fa8a975aa517ada9b26c: Status 404 returned error can't find the container with id 512c5e5fd319698dcd47a5e29fa32b38c1138afd1cf0fa8a975aa517ada9b26c Sep 29 10:00:52 crc kubenswrapper[4891]: I0929 10:00:52.187464 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/eec47c49-2fdd-4eba-aca2-438041840948-memberlist\") pod \"speaker-wlhcc\" (UID: \"eec47c49-2fdd-4eba-aca2-438041840948\") " pod="metallb-system/speaker-wlhcc" Sep 29 10:00:52 crc kubenswrapper[4891]: I0929 10:00:52.187580 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eec47c49-2fdd-4eba-aca2-438041840948-metrics-certs\") pod \"speaker-wlhcc\" (UID: \"eec47c49-2fdd-4eba-aca2-438041840948\") " pod="metallb-system/speaker-wlhcc" Sep 29 10:00:52 crc kubenswrapper[4891]: E0929 10:00:52.187736 4891 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 29 10:00:52 crc kubenswrapper[4891]: E0929 10:00:52.187902 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eec47c49-2fdd-4eba-aca2-438041840948-memberlist podName:eec47c49-2fdd-4eba-aca2-438041840948 nodeName:}" failed. No retries permitted until 2025-09-29 10:00:53.187868365 +0000 UTC m=+783.393036676 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/eec47c49-2fdd-4eba-aca2-438041840948-memberlist") pod "speaker-wlhcc" (UID: "eec47c49-2fdd-4eba-aca2-438041840948") : secret "metallb-memberlist" not found Sep 29 10:00:52 crc kubenswrapper[4891]: I0929 10:00:52.193494 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eec47c49-2fdd-4eba-aca2-438041840948-metrics-certs\") pod \"speaker-wlhcc\" (UID: \"eec47c49-2fdd-4eba-aca2-438041840948\") " pod="metallb-system/speaker-wlhcc" Sep 29 10:00:52 crc kubenswrapper[4891]: I0929 10:00:52.239155 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-2q7w2"] Sep 29 10:00:52 crc kubenswrapper[4891]: W0929 10:00:52.244095 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod105a82c3_b488_41fb_a511_69b3c239dbd2.slice/crio-70d4278c1cf760ac9ac78f3698a3ae65d3739a8141712fe25e03c63689451d83 WatchSource:0}: Error finding container 70d4278c1cf760ac9ac78f3698a3ae65d3739a8141712fe25e03c63689451d83: Status 404 returned error can't find the 
container with id 70d4278c1cf760ac9ac78f3698a3ae65d3739a8141712fe25e03c63689451d83 Sep 29 10:00:52 crc kubenswrapper[4891]: I0929 10:00:52.273133 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:00:52 crc kubenswrapper[4891]: I0929 10:00:52.296076 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwrmn" event={"ID":"6a2d3a84-80f0-49f5-966c-034ab38dcb98","Type":"ContainerStarted","Data":"951032e0fc2c3ba01f76c28713915ec376e1a5841f5b337c6437d1b01eb967b2"} Sep 29 10:00:52 crc kubenswrapper[4891]: I0929 10:00:52.297822 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bb72s" event={"ID":"88e3267b-49e6-443d-8cc6-285a983b44ec","Type":"ContainerStarted","Data":"512c5e5fd319698dcd47a5e29fa32b38c1138afd1cf0fa8a975aa517ada9b26c"} Sep 29 10:00:52 crc kubenswrapper[4891]: I0929 10:00:52.299278 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-2q7w2" event={"ID":"105a82c3-b488-41fb-a511-69b3c239dbd2","Type":"ContainerStarted","Data":"70d4278c1cf760ac9ac78f3698a3ae65d3739a8141712fe25e03c63689451d83"} Sep 29 10:00:53 crc kubenswrapper[4891]: I0929 10:00:53.203097 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/eec47c49-2fdd-4eba-aca2-438041840948-memberlist\") pod \"speaker-wlhcc\" (UID: \"eec47c49-2fdd-4eba-aca2-438041840948\") " pod="metallb-system/speaker-wlhcc" Sep 29 10:00:53 crc kubenswrapper[4891]: I0929 10:00:53.212483 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/eec47c49-2fdd-4eba-aca2-438041840948-memberlist\") pod \"speaker-wlhcc\" (UID: \"eec47c49-2fdd-4eba-aca2-438041840948\") " pod="metallb-system/speaker-wlhcc" Sep 29 10:00:53 crc kubenswrapper[4891]: I0929 10:00:53.270363 4891 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-wlhcc" Sep 29 10:00:53 crc kubenswrapper[4891]: W0929 10:00:53.303850 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeec47c49_2fdd_4eba_aca2_438041840948.slice/crio-235ac51a1749574126561f10f114d743d6104615fddae5965ae7377dc1fcfd29 WatchSource:0}: Error finding container 235ac51a1749574126561f10f114d743d6104615fddae5965ae7377dc1fcfd29: Status 404 returned error can't find the container with id 235ac51a1749574126561f10f114d743d6104615fddae5965ae7377dc1fcfd29 Sep 29 10:00:53 crc kubenswrapper[4891]: I0929 10:00:53.308682 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-2q7w2" event={"ID":"105a82c3-b488-41fb-a511-69b3c239dbd2","Type":"ContainerStarted","Data":"024d5fb772c8068c8c386fc5af8b2fa287c32dcdce37de54cb6659a8de1103ef"} Sep 29 10:00:53 crc kubenswrapper[4891]: I0929 10:00:53.309464 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-2q7w2" Sep 29 10:00:53 crc kubenswrapper[4891]: I0929 10:00:53.309540 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-2q7w2" event={"ID":"105a82c3-b488-41fb-a511-69b3c239dbd2","Type":"ContainerStarted","Data":"be6da4196379e3970882a70244b28ce7a63d8f604cd07632acdcdd5002d74ad4"} Sep 29 10:00:53 crc kubenswrapper[4891]: I0929 10:00:53.322499 4891 generic.go:334] "Generic (PLEG): container finished" podID="6a2d3a84-80f0-49f5-966c-034ab38dcb98" containerID="951032e0fc2c3ba01f76c28713915ec376e1a5841f5b337c6437d1b01eb967b2" exitCode=0 Sep 29 10:00:53 crc kubenswrapper[4891]: I0929 10:00:53.322637 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwrmn" 
event={"ID":"6a2d3a84-80f0-49f5-966c-034ab38dcb98","Type":"ContainerDied","Data":"951032e0fc2c3ba01f76c28713915ec376e1a5841f5b337c6437d1b01eb967b2"} Sep 29 10:00:53 crc kubenswrapper[4891]: I0929 10:00:53.324266 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8hs8n" event={"ID":"e14026d1-17ce-4f77-b28a-274da74a5c15","Type":"ContainerStarted","Data":"f2efdb2f5cc75155e0d6ede61bb3fee7d869e5d17fb114e174eaff53b4ed76f6"} Sep 29 10:00:53 crc kubenswrapper[4891]: I0929 10:00:53.337052 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-2q7w2" podStartSLOduration=2.337019483 podStartE2EDuration="2.337019483s" podCreationTimestamp="2025-09-29 10:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:00:53.331504889 +0000 UTC m=+783.536673210" watchObservedRunningTime="2025-09-29 10:00:53.337019483 +0000 UTC m=+783.542187804" Sep 29 10:00:54 crc kubenswrapper[4891]: I0929 10:00:54.337257 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwrmn" event={"ID":"6a2d3a84-80f0-49f5-966c-034ab38dcb98","Type":"ContainerStarted","Data":"2494e862893727cbc4a78a38f665fc77150708d8b88df3188fc0643413507833"} Sep 29 10:00:54 crc kubenswrapper[4891]: I0929 10:00:54.340050 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wlhcc" event={"ID":"eec47c49-2fdd-4eba-aca2-438041840948","Type":"ContainerStarted","Data":"655354c431180f37878e53f79e400ca4bc6e003ea14c2a225ffe7cefb848fe2f"} Sep 29 10:00:54 crc kubenswrapper[4891]: I0929 10:00:54.340126 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wlhcc" event={"ID":"eec47c49-2fdd-4eba-aca2-438041840948","Type":"ContainerStarted","Data":"99f3807c3bb45d3242c09caaf33db3c4c839321b575adf6838e9a6f57bd8937b"} Sep 29 10:00:54 crc kubenswrapper[4891]: 
I0929 10:00:54.340138 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wlhcc" event={"ID":"eec47c49-2fdd-4eba-aca2-438041840948","Type":"ContainerStarted","Data":"235ac51a1749574126561f10f114d743d6104615fddae5965ae7377dc1fcfd29"} Sep 29 10:00:54 crc kubenswrapper[4891]: I0929 10:00:54.340470 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-wlhcc" Sep 29 10:00:54 crc kubenswrapper[4891]: I0929 10:00:54.366983 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xwrmn" podStartSLOduration=2.563863076 podStartE2EDuration="5.366955233s" podCreationTimestamp="2025-09-29 10:00:49 +0000 UTC" firstStartedPulling="2025-09-29 10:00:51.288640309 +0000 UTC m=+781.493808630" lastFinishedPulling="2025-09-29 10:00:54.091732466 +0000 UTC m=+784.296900787" observedRunningTime="2025-09-29 10:00:54.362284323 +0000 UTC m=+784.567452644" watchObservedRunningTime="2025-09-29 10:00:54.366955233 +0000 UTC m=+784.572123564" Sep 29 10:00:54 crc kubenswrapper[4891]: I0929 10:00:54.393280 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-wlhcc" podStartSLOduration=3.3932568180000002 podStartE2EDuration="3.393256818s" podCreationTimestamp="2025-09-29 10:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:00:54.391257508 +0000 UTC m=+784.596425839" watchObservedRunningTime="2025-09-29 10:00:54.393256818 +0000 UTC m=+784.598425149" Sep 29 10:01:00 crc kubenswrapper[4891]: I0929 10:01:00.151759 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xwrmn" Sep 29 10:01:00 crc kubenswrapper[4891]: I0929 10:01:00.152646 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xwrmn" 
Sep 29 10:01:00 crc kubenswrapper[4891]: I0929 10:01:00.225602 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xwrmn" Sep 29 10:01:00 crc kubenswrapper[4891]: I0929 10:01:00.435231 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xwrmn" Sep 29 10:01:00 crc kubenswrapper[4891]: I0929 10:01:00.494961 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xwrmn"] Sep 29 10:01:02 crc kubenswrapper[4891]: I0929 10:01:02.403345 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xwrmn" podUID="6a2d3a84-80f0-49f5-966c-034ab38dcb98" containerName="registry-server" containerID="cri-o://2494e862893727cbc4a78a38f665fc77150708d8b88df3188fc0643413507833" gracePeriod=2 Sep 29 10:01:03 crc kubenswrapper[4891]: I0929 10:01:03.313478 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-wlhcc" Sep 29 10:01:03 crc kubenswrapper[4891]: I0929 10:01:03.418702 4891 generic.go:334] "Generic (PLEG): container finished" podID="6a2d3a84-80f0-49f5-966c-034ab38dcb98" containerID="2494e862893727cbc4a78a38f665fc77150708d8b88df3188fc0643413507833" exitCode=0 Sep 29 10:01:03 crc kubenswrapper[4891]: I0929 10:01:03.418758 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwrmn" event={"ID":"6a2d3a84-80f0-49f5-966c-034ab38dcb98","Type":"ContainerDied","Data":"2494e862893727cbc4a78a38f665fc77150708d8b88df3188fc0643413507833"} Sep 29 10:01:03 crc kubenswrapper[4891]: I0929 10:01:03.473063 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xwrmn" Sep 29 10:01:03 crc kubenswrapper[4891]: I0929 10:01:03.578641 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xp4f\" (UniqueName: \"kubernetes.io/projected/6a2d3a84-80f0-49f5-966c-034ab38dcb98-kube-api-access-8xp4f\") pod \"6a2d3a84-80f0-49f5-966c-034ab38dcb98\" (UID: \"6a2d3a84-80f0-49f5-966c-034ab38dcb98\") " Sep 29 10:01:03 crc kubenswrapper[4891]: I0929 10:01:03.578840 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a2d3a84-80f0-49f5-966c-034ab38dcb98-catalog-content\") pod \"6a2d3a84-80f0-49f5-966c-034ab38dcb98\" (UID: \"6a2d3a84-80f0-49f5-966c-034ab38dcb98\") " Sep 29 10:01:03 crc kubenswrapper[4891]: I0929 10:01:03.578891 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a2d3a84-80f0-49f5-966c-034ab38dcb98-utilities\") pod \"6a2d3a84-80f0-49f5-966c-034ab38dcb98\" (UID: \"6a2d3a84-80f0-49f5-966c-034ab38dcb98\") " Sep 29 10:01:03 crc kubenswrapper[4891]: I0929 10:01:03.580207 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a2d3a84-80f0-49f5-966c-034ab38dcb98-utilities" (OuterVolumeSpecName: "utilities") pod "6a2d3a84-80f0-49f5-966c-034ab38dcb98" (UID: "6a2d3a84-80f0-49f5-966c-034ab38dcb98"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:01:03 crc kubenswrapper[4891]: I0929 10:01:03.589196 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a2d3a84-80f0-49f5-966c-034ab38dcb98-kube-api-access-8xp4f" (OuterVolumeSpecName: "kube-api-access-8xp4f") pod "6a2d3a84-80f0-49f5-966c-034ab38dcb98" (UID: "6a2d3a84-80f0-49f5-966c-034ab38dcb98"). InnerVolumeSpecName "kube-api-access-8xp4f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:01:03 crc kubenswrapper[4891]: I0929 10:01:03.681463 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a2d3a84-80f0-49f5-966c-034ab38dcb98-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:03 crc kubenswrapper[4891]: I0929 10:01:03.681519 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xp4f\" (UniqueName: \"kubernetes.io/projected/6a2d3a84-80f0-49f5-966c-034ab38dcb98-kube-api-access-8xp4f\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:03 crc kubenswrapper[4891]: I0929 10:01:03.787210 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a2d3a84-80f0-49f5-966c-034ab38dcb98-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a2d3a84-80f0-49f5-966c-034ab38dcb98" (UID: "6a2d3a84-80f0-49f5-966c-034ab38dcb98"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:01:03 crc kubenswrapper[4891]: I0929 10:01:03.884822 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a2d3a84-80f0-49f5-966c-034ab38dcb98-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:04 crc kubenswrapper[4891]: I0929 10:01:04.428712 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xwrmn" Sep 29 10:01:04 crc kubenswrapper[4891]: I0929 10:01:04.428710 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwrmn" event={"ID":"6a2d3a84-80f0-49f5-966c-034ab38dcb98","Type":"ContainerDied","Data":"adc8f477c23090b669cbf854fe396c2d0af8d3de3eeef1a554a1ef1750e27800"} Sep 29 10:01:04 crc kubenswrapper[4891]: I0929 10:01:04.429746 4891 scope.go:117] "RemoveContainer" containerID="2494e862893727cbc4a78a38f665fc77150708d8b88df3188fc0643413507833" Sep 29 10:01:04 crc kubenswrapper[4891]: I0929 10:01:04.432903 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bb72s" event={"ID":"88e3267b-49e6-443d-8cc6-285a983b44ec","Type":"ContainerStarted","Data":"4bd4bf4fec35dad3a25ff1948e9b16b2e9d808c99eca6189e67b37a0fe50019c"} Sep 29 10:01:04 crc kubenswrapper[4891]: I0929 10:01:04.433182 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bb72s" Sep 29 10:01:04 crc kubenswrapper[4891]: I0929 10:01:04.437480 4891 generic.go:334] "Generic (PLEG): container finished" podID="e14026d1-17ce-4f77-b28a-274da74a5c15" containerID="22846a5576c6c7d4a79360d7cc95088e0cd20af995b7c41360b1748383288cad" exitCode=0 Sep 29 10:01:04 crc kubenswrapper[4891]: I0929 10:01:04.437538 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8hs8n" event={"ID":"e14026d1-17ce-4f77-b28a-274da74a5c15","Type":"ContainerDied","Data":"22846a5576c6c7d4a79360d7cc95088e0cd20af995b7c41360b1748383288cad"} Sep 29 10:01:04 crc kubenswrapper[4891]: I0929 10:01:04.454997 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bb72s" podStartSLOduration=2.298513013 podStartE2EDuration="13.454968423s" podCreationTimestamp="2025-09-29 10:00:51 +0000 UTC" 
firstStartedPulling="2025-09-29 10:00:52.123242135 +0000 UTC m=+782.328410456" lastFinishedPulling="2025-09-29 10:01:03.279697535 +0000 UTC m=+793.484865866" observedRunningTime="2025-09-29 10:01:04.44715681 +0000 UTC m=+794.652325141" watchObservedRunningTime="2025-09-29 10:01:04.454968423 +0000 UTC m=+794.660136754" Sep 29 10:01:04 crc kubenswrapper[4891]: I0929 10:01:04.472741 4891 scope.go:117] "RemoveContainer" containerID="951032e0fc2c3ba01f76c28713915ec376e1a5841f5b337c6437d1b01eb967b2" Sep 29 10:01:04 crc kubenswrapper[4891]: I0929 10:01:04.497893 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xwrmn"] Sep 29 10:01:04 crc kubenswrapper[4891]: I0929 10:01:04.502603 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xwrmn"] Sep 29 10:01:04 crc kubenswrapper[4891]: I0929 10:01:04.524109 4891 scope.go:117] "RemoveContainer" containerID="c72cbd92b0776be685a1404b3812a7f90fd92da7306e46f2cf1dc87243c55c8f" Sep 29 10:01:05 crc kubenswrapper[4891]: I0929 10:01:05.448153 4891 generic.go:334] "Generic (PLEG): container finished" podID="e14026d1-17ce-4f77-b28a-274da74a5c15" containerID="15bc22d630d838ea16637d01a83139ce5356b98e744db74216dc53772910d1ee" exitCode=0 Sep 29 10:01:05 crc kubenswrapper[4891]: I0929 10:01:05.448227 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8hs8n" event={"ID":"e14026d1-17ce-4f77-b28a-274da74a5c15","Type":"ContainerDied","Data":"15bc22d630d838ea16637d01a83139ce5356b98e744db74216dc53772910d1ee"} Sep 29 10:01:06 crc kubenswrapper[4891]: I0929 10:01:06.185670 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:01:06 crc kubenswrapper[4891]: I0929 10:01:06.186020 4891 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:01:06 crc kubenswrapper[4891]: I0929 10:01:06.234383 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-7tbrb"] Sep 29 10:01:06 crc kubenswrapper[4891]: E0929 10:01:06.234610 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a2d3a84-80f0-49f5-966c-034ab38dcb98" containerName="registry-server" Sep 29 10:01:06 crc kubenswrapper[4891]: I0929 10:01:06.234622 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a2d3a84-80f0-49f5-966c-034ab38dcb98" containerName="registry-server" Sep 29 10:01:06 crc kubenswrapper[4891]: E0929 10:01:06.234633 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a2d3a84-80f0-49f5-966c-034ab38dcb98" containerName="extract-utilities" Sep 29 10:01:06 crc kubenswrapper[4891]: I0929 10:01:06.234639 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a2d3a84-80f0-49f5-966c-034ab38dcb98" containerName="extract-utilities" Sep 29 10:01:06 crc kubenswrapper[4891]: E0929 10:01:06.234655 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a2d3a84-80f0-49f5-966c-034ab38dcb98" containerName="extract-content" Sep 29 10:01:06 crc kubenswrapper[4891]: I0929 10:01:06.234661 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a2d3a84-80f0-49f5-966c-034ab38dcb98" containerName="extract-content" Sep 29 10:01:06 crc kubenswrapper[4891]: I0929 10:01:06.234757 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a2d3a84-80f0-49f5-966c-034ab38dcb98" containerName="registry-server" Sep 29 10:01:06 crc kubenswrapper[4891]: I0929 10:01:06.235169 4891 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7tbrb" Sep 29 10:01:06 crc kubenswrapper[4891]: I0929 10:01:06.244212 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Sep 29 10:01:06 crc kubenswrapper[4891]: I0929 10:01:06.244500 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Sep 29 10:01:06 crc kubenswrapper[4891]: I0929 10:01:06.244655 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-b656n" Sep 29 10:01:06 crc kubenswrapper[4891]: I0929 10:01:06.250388 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7tbrb"] Sep 29 10:01:06 crc kubenswrapper[4891]: I0929 10:01:06.320700 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prndr\" (UniqueName: \"kubernetes.io/projected/514420d4-e9ef-4e04-ba98-7e9adaa19cfb-kube-api-access-prndr\") pod \"openstack-operator-index-7tbrb\" (UID: \"514420d4-e9ef-4e04-ba98-7e9adaa19cfb\") " pod="openstack-operators/openstack-operator-index-7tbrb" Sep 29 10:01:06 crc kubenswrapper[4891]: I0929 10:01:06.404359 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a2d3a84-80f0-49f5-966c-034ab38dcb98" path="/var/lib/kubelet/pods/6a2d3a84-80f0-49f5-966c-034ab38dcb98/volumes" Sep 29 10:01:06 crc kubenswrapper[4891]: I0929 10:01:06.422928 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prndr\" (UniqueName: \"kubernetes.io/projected/514420d4-e9ef-4e04-ba98-7e9adaa19cfb-kube-api-access-prndr\") pod \"openstack-operator-index-7tbrb\" (UID: \"514420d4-e9ef-4e04-ba98-7e9adaa19cfb\") " pod="openstack-operators/openstack-operator-index-7tbrb" Sep 29 10:01:06 crc kubenswrapper[4891]: I0929 10:01:06.445534 4891 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prndr\" (UniqueName: \"kubernetes.io/projected/514420d4-e9ef-4e04-ba98-7e9adaa19cfb-kube-api-access-prndr\") pod \"openstack-operator-index-7tbrb\" (UID: \"514420d4-e9ef-4e04-ba98-7e9adaa19cfb\") " pod="openstack-operators/openstack-operator-index-7tbrb" Sep 29 10:01:06 crc kubenswrapper[4891]: I0929 10:01:06.457665 4891 generic.go:334] "Generic (PLEG): container finished" podID="e14026d1-17ce-4f77-b28a-274da74a5c15" containerID="671b2ffc36584c03e46b97a124bcf207b1649b367b1dd6e9e9b3906044ce53d6" exitCode=0 Sep 29 10:01:06 crc kubenswrapper[4891]: I0929 10:01:06.457725 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8hs8n" event={"ID":"e14026d1-17ce-4f77-b28a-274da74a5c15","Type":"ContainerDied","Data":"671b2ffc36584c03e46b97a124bcf207b1649b367b1dd6e9e9b3906044ce53d6"} Sep 29 10:01:06 crc kubenswrapper[4891]: I0929 10:01:06.549257 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-7tbrb" Sep 29 10:01:06 crc kubenswrapper[4891]: I0929 10:01:06.987633 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7tbrb"] Sep 29 10:01:06 crc kubenswrapper[4891]: W0929 10:01:06.994388 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod514420d4_e9ef_4e04_ba98_7e9adaa19cfb.slice/crio-1df3bf08409de82d66d1d80e10f6a413997eba98476e6f397172cb11e4e0ec64 WatchSource:0}: Error finding container 1df3bf08409de82d66d1d80e10f6a413997eba98476e6f397172cb11e4e0ec64: Status 404 returned error can't find the container with id 1df3bf08409de82d66d1d80e10f6a413997eba98476e6f397172cb11e4e0ec64 Sep 29 10:01:07 crc kubenswrapper[4891]: I0929 10:01:07.468825 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7tbrb" event={"ID":"514420d4-e9ef-4e04-ba98-7e9adaa19cfb","Type":"ContainerStarted","Data":"1df3bf08409de82d66d1d80e10f6a413997eba98476e6f397172cb11e4e0ec64"} Sep 29 10:01:07 crc kubenswrapper[4891]: I0929 10:01:07.473374 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8hs8n" event={"ID":"e14026d1-17ce-4f77-b28a-274da74a5c15","Type":"ContainerStarted","Data":"59d5fbb0a78941478edf85b479f65c5d131681cce793def6f5a176cb53fa6131"} Sep 29 10:01:07 crc kubenswrapper[4891]: I0929 10:01:07.473439 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8hs8n" event={"ID":"e14026d1-17ce-4f77-b28a-274da74a5c15","Type":"ContainerStarted","Data":"b011ee33eeb2c52d61222ec70fa7ee7a56867aa6313fce8c72a13cd5db88da95"} Sep 29 10:01:07 crc kubenswrapper[4891]: I0929 10:01:07.473452 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8hs8n" 
event={"ID":"e14026d1-17ce-4f77-b28a-274da74a5c15","Type":"ContainerStarted","Data":"a839c5055c0dbda24b909528c71c52275a8d603a95442e6c7654afdc4a076810"} Sep 29 10:01:07 crc kubenswrapper[4891]: I0929 10:01:07.473466 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8hs8n" event={"ID":"e14026d1-17ce-4f77-b28a-274da74a5c15","Type":"ContainerStarted","Data":"db2e26c387d43501881af2d80efd42f27debdf42f6228e04c3f98063fbb9c6e0"} Sep 29 10:01:08 crc kubenswrapper[4891]: I0929 10:01:08.488578 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8hs8n" event={"ID":"e14026d1-17ce-4f77-b28a-274da74a5c15","Type":"ContainerStarted","Data":"f99a9b114d530f1706c1e42e2816b7ef0a805ee7f3e88d1f45f296f018331ca9"} Sep 29 10:01:08 crc kubenswrapper[4891]: I0929 10:01:08.489052 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:01:08 crc kubenswrapper[4891]: I0929 10:01:08.489072 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8hs8n" event={"ID":"e14026d1-17ce-4f77-b28a-274da74a5c15","Type":"ContainerStarted","Data":"8b98e425da65ee24d2d8a4c70a70b237a7f7748a5dc4a5b8345645c5b01be212"} Sep 29 10:01:08 crc kubenswrapper[4891]: I0929 10:01:08.521565 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-8hs8n" podStartSLOduration=6.679286563 podStartE2EDuration="17.521531492s" podCreationTimestamp="2025-09-29 10:00:51 +0000 UTC" firstStartedPulling="2025-09-29 10:00:52.397891685 +0000 UTC m=+782.603060006" lastFinishedPulling="2025-09-29 10:01:03.240136614 +0000 UTC m=+793.445304935" observedRunningTime="2025-09-29 10:01:08.518242274 +0000 UTC m=+798.723410715" watchObservedRunningTime="2025-09-29 10:01:08.521531492 +0000 UTC m=+798.726699853" Sep 29 10:01:09 crc kubenswrapper[4891]: I0929 10:01:09.488488 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/openstack-operator-index-7tbrb"] Sep 29 10:01:09 crc kubenswrapper[4891]: I0929 10:01:09.498069 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7tbrb" event={"ID":"514420d4-e9ef-4e04-ba98-7e9adaa19cfb","Type":"ContainerStarted","Data":"5a91c28a873d0548f55112a0bda8c05ce21b9f1922221c6b41658865fae62b4d"} Sep 29 10:01:09 crc kubenswrapper[4891]: I0929 10:01:09.521711 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-7tbrb" podStartSLOduration=1.328923566 podStartE2EDuration="3.521688402s" podCreationTimestamp="2025-09-29 10:01:06 +0000 UTC" firstStartedPulling="2025-09-29 10:01:06.998439179 +0000 UTC m=+797.203607500" lastFinishedPulling="2025-09-29 10:01:09.191204015 +0000 UTC m=+799.396372336" observedRunningTime="2025-09-29 10:01:09.517362593 +0000 UTC m=+799.722530924" watchObservedRunningTime="2025-09-29 10:01:09.521688402 +0000 UTC m=+799.726856723" Sep 29 10:01:10 crc kubenswrapper[4891]: I0929 10:01:10.084847 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-qvghn"] Sep 29 10:01:10 crc kubenswrapper[4891]: I0929 10:01:10.085843 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qvghn" Sep 29 10:01:10 crc kubenswrapper[4891]: I0929 10:01:10.097950 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qvghn"] Sep 29 10:01:10 crc kubenswrapper[4891]: I0929 10:01:10.198283 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fswfx\" (UniqueName: \"kubernetes.io/projected/dffc134e-8cef-47fa-a97b-08b58fee948c-kube-api-access-fswfx\") pod \"openstack-operator-index-qvghn\" (UID: \"dffc134e-8cef-47fa-a97b-08b58fee948c\") " pod="openstack-operators/openstack-operator-index-qvghn" Sep 29 10:01:10 crc kubenswrapper[4891]: I0929 10:01:10.300620 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fswfx\" (UniqueName: \"kubernetes.io/projected/dffc134e-8cef-47fa-a97b-08b58fee948c-kube-api-access-fswfx\") pod \"openstack-operator-index-qvghn\" (UID: \"dffc134e-8cef-47fa-a97b-08b58fee948c\") " pod="openstack-operators/openstack-operator-index-qvghn" Sep 29 10:01:10 crc kubenswrapper[4891]: I0929 10:01:10.327184 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fswfx\" (UniqueName: \"kubernetes.io/projected/dffc134e-8cef-47fa-a97b-08b58fee948c-kube-api-access-fswfx\") pod \"openstack-operator-index-qvghn\" (UID: \"dffc134e-8cef-47fa-a97b-08b58fee948c\") " pod="openstack-operators/openstack-operator-index-qvghn" Sep 29 10:01:10 crc kubenswrapper[4891]: I0929 10:01:10.409298 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qvghn" Sep 29 10:01:10 crc kubenswrapper[4891]: I0929 10:01:10.505343 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-7tbrb" podUID="514420d4-e9ef-4e04-ba98-7e9adaa19cfb" containerName="registry-server" containerID="cri-o://5a91c28a873d0548f55112a0bda8c05ce21b9f1922221c6b41658865fae62b4d" gracePeriod=2 Sep 29 10:01:10 crc kubenswrapper[4891]: I0929 10:01:10.658467 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qvghn"] Sep 29 10:01:10 crc kubenswrapper[4891]: W0929 10:01:10.669173 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddffc134e_8cef_47fa_a97b_08b58fee948c.slice/crio-8f839741e488c913cfa4f919e7f06245e15f3bbc8b92c5e79e8503f6f363ff24 WatchSource:0}: Error finding container 8f839741e488c913cfa4f919e7f06245e15f3bbc8b92c5e79e8503f6f363ff24: Status 404 returned error can't find the container with id 8f839741e488c913cfa4f919e7f06245e15f3bbc8b92c5e79e8503f6f363ff24 Sep 29 10:01:10 crc kubenswrapper[4891]: I0929 10:01:10.901874 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-7tbrb" Sep 29 10:01:11 crc kubenswrapper[4891]: I0929 10:01:11.015023 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prndr\" (UniqueName: \"kubernetes.io/projected/514420d4-e9ef-4e04-ba98-7e9adaa19cfb-kube-api-access-prndr\") pod \"514420d4-e9ef-4e04-ba98-7e9adaa19cfb\" (UID: \"514420d4-e9ef-4e04-ba98-7e9adaa19cfb\") " Sep 29 10:01:11 crc kubenswrapper[4891]: I0929 10:01:11.022719 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/514420d4-e9ef-4e04-ba98-7e9adaa19cfb-kube-api-access-prndr" (OuterVolumeSpecName: "kube-api-access-prndr") pod "514420d4-e9ef-4e04-ba98-7e9adaa19cfb" (UID: "514420d4-e9ef-4e04-ba98-7e9adaa19cfb"). InnerVolumeSpecName "kube-api-access-prndr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:01:11 crc kubenswrapper[4891]: I0929 10:01:11.117093 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prndr\" (UniqueName: \"kubernetes.io/projected/514420d4-e9ef-4e04-ba98-7e9adaa19cfb-kube-api-access-prndr\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:11 crc kubenswrapper[4891]: I0929 10:01:11.517501 4891 generic.go:334] "Generic (PLEG): container finished" podID="514420d4-e9ef-4e04-ba98-7e9adaa19cfb" containerID="5a91c28a873d0548f55112a0bda8c05ce21b9f1922221c6b41658865fae62b4d" exitCode=0 Sep 29 10:01:11 crc kubenswrapper[4891]: I0929 10:01:11.517627 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7tbrb" event={"ID":"514420d4-e9ef-4e04-ba98-7e9adaa19cfb","Type":"ContainerDied","Data":"5a91c28a873d0548f55112a0bda8c05ce21b9f1922221c6b41658865fae62b4d"} Sep 29 10:01:11 crc kubenswrapper[4891]: I0929 10:01:11.517674 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7tbrb" 
event={"ID":"514420d4-e9ef-4e04-ba98-7e9adaa19cfb","Type":"ContainerDied","Data":"1df3bf08409de82d66d1d80e10f6a413997eba98476e6f397172cb11e4e0ec64"} Sep 29 10:01:11 crc kubenswrapper[4891]: I0929 10:01:11.517708 4891 scope.go:117] "RemoveContainer" containerID="5a91c28a873d0548f55112a0bda8c05ce21b9f1922221c6b41658865fae62b4d" Sep 29 10:01:11 crc kubenswrapper[4891]: I0929 10:01:11.519049 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7tbrb" Sep 29 10:01:11 crc kubenswrapper[4891]: I0929 10:01:11.526783 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qvghn" event={"ID":"dffc134e-8cef-47fa-a97b-08b58fee948c","Type":"ContainerStarted","Data":"90b1bf102153aaefcf3054e5eb0ef24f6b1ce96b234b7845d4d3b7aef2897cb0"} Sep 29 10:01:11 crc kubenswrapper[4891]: I0929 10:01:11.527350 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qvghn" event={"ID":"dffc134e-8cef-47fa-a97b-08b58fee948c","Type":"ContainerStarted","Data":"8f839741e488c913cfa4f919e7f06245e15f3bbc8b92c5e79e8503f6f363ff24"} Sep 29 10:01:11 crc kubenswrapper[4891]: I0929 10:01:11.558133 4891 scope.go:117] "RemoveContainer" containerID="5a91c28a873d0548f55112a0bda8c05ce21b9f1922221c6b41658865fae62b4d" Sep 29 10:01:11 crc kubenswrapper[4891]: E0929 10:01:11.564135 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a91c28a873d0548f55112a0bda8c05ce21b9f1922221c6b41658865fae62b4d\": container with ID starting with 5a91c28a873d0548f55112a0bda8c05ce21b9f1922221c6b41658865fae62b4d not found: ID does not exist" containerID="5a91c28a873d0548f55112a0bda8c05ce21b9f1922221c6b41658865fae62b4d" Sep 29 10:01:11 crc kubenswrapper[4891]: I0929 10:01:11.564230 4891 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5a91c28a873d0548f55112a0bda8c05ce21b9f1922221c6b41658865fae62b4d"} err="failed to get container status \"5a91c28a873d0548f55112a0bda8c05ce21b9f1922221c6b41658865fae62b4d\": rpc error: code = NotFound desc = could not find container \"5a91c28a873d0548f55112a0bda8c05ce21b9f1922221c6b41658865fae62b4d\": container with ID starting with 5a91c28a873d0548f55112a0bda8c05ce21b9f1922221c6b41658865fae62b4d not found: ID does not exist" Sep 29 10:01:11 crc kubenswrapper[4891]: I0929 10:01:11.567177 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-qvghn" podStartSLOduration=1.523278269 podStartE2EDuration="1.567145819s" podCreationTimestamp="2025-09-29 10:01:10 +0000 UTC" firstStartedPulling="2025-09-29 10:01:10.674633463 +0000 UTC m=+800.879801784" lastFinishedPulling="2025-09-29 10:01:10.718501013 +0000 UTC m=+800.923669334" observedRunningTime="2025-09-29 10:01:11.558716158 +0000 UTC m=+801.763884519" watchObservedRunningTime="2025-09-29 10:01:11.567145819 +0000 UTC m=+801.772314150" Sep 29 10:01:11 crc kubenswrapper[4891]: I0929 10:01:11.586923 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-7tbrb"] Sep 29 10:01:11 crc kubenswrapper[4891]: I0929 10:01:11.593882 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-7tbrb"] Sep 29 10:01:11 crc kubenswrapper[4891]: I0929 10:01:11.806188 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5d688f5ffc-2q7w2" Sep 29 10:01:12 crc kubenswrapper[4891]: I0929 10:01:12.274907 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:01:12 crc kubenswrapper[4891]: I0929 10:01:12.342130 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:01:12 
crc kubenswrapper[4891]: I0929 10:01:12.408985 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="514420d4-e9ef-4e04-ba98-7e9adaa19cfb" path="/var/lib/kubelet/pods/514420d4-e9ef-4e04-ba98-7e9adaa19cfb/volumes" Sep 29 10:01:20 crc kubenswrapper[4891]: I0929 10:01:20.426947 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-qvghn" Sep 29 10:01:20 crc kubenswrapper[4891]: I0929 10:01:20.429755 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-qvghn" Sep 29 10:01:20 crc kubenswrapper[4891]: I0929 10:01:20.459502 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-qvghn" Sep 29 10:01:20 crc kubenswrapper[4891]: I0929 10:01:20.654150 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-qvghn" Sep 29 10:01:21 crc kubenswrapper[4891]: I0929 10:01:21.688278 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bb72s" Sep 29 10:01:22 crc kubenswrapper[4891]: I0929 10:01:22.279605 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-8hs8n" Sep 29 10:01:27 crc kubenswrapper[4891]: I0929 10:01:27.978931 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj"] Sep 29 10:01:27 crc kubenswrapper[4891]: E0929 10:01:27.983391 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514420d4-e9ef-4e04-ba98-7e9adaa19cfb" containerName="registry-server" Sep 29 10:01:27 crc kubenswrapper[4891]: I0929 10:01:27.983468 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="514420d4-e9ef-4e04-ba98-7e9adaa19cfb" containerName="registry-server" Sep 29 10:01:27 crc 
kubenswrapper[4891]: I0929 10:01:27.983652 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="514420d4-e9ef-4e04-ba98-7e9adaa19cfb" containerName="registry-server" Sep 29 10:01:27 crc kubenswrapper[4891]: I0929 10:01:27.984763 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj" Sep 29 10:01:27 crc kubenswrapper[4891]: I0929 10:01:27.987669 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-5j7pl" Sep 29 10:01:27 crc kubenswrapper[4891]: I0929 10:01:27.998894 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj"] Sep 29 10:01:28 crc kubenswrapper[4891]: I0929 10:01:28.185994 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47086f2a-3e89-4170-9f19-5bfd1d07c1ff-bundle\") pod \"7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj\" (UID: \"47086f2a-3e89-4170-9f19-5bfd1d07c1ff\") " pod="openstack-operators/7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj" Sep 29 10:01:28 crc kubenswrapper[4891]: I0929 10:01:28.186543 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47086f2a-3e89-4170-9f19-5bfd1d07c1ff-util\") pod \"7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj\" (UID: \"47086f2a-3e89-4170-9f19-5bfd1d07c1ff\") " pod="openstack-operators/7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj" Sep 29 10:01:28 crc kubenswrapper[4891]: I0929 10:01:28.186818 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lfmj\" (UniqueName: 
\"kubernetes.io/projected/47086f2a-3e89-4170-9f19-5bfd1d07c1ff-kube-api-access-2lfmj\") pod \"7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj\" (UID: \"47086f2a-3e89-4170-9f19-5bfd1d07c1ff\") " pod="openstack-operators/7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj" Sep 29 10:01:28 crc kubenswrapper[4891]: I0929 10:01:28.288534 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47086f2a-3e89-4170-9f19-5bfd1d07c1ff-bundle\") pod \"7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj\" (UID: \"47086f2a-3e89-4170-9f19-5bfd1d07c1ff\") " pod="openstack-operators/7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj" Sep 29 10:01:28 crc kubenswrapper[4891]: I0929 10:01:28.288910 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47086f2a-3e89-4170-9f19-5bfd1d07c1ff-util\") pod \"7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj\" (UID: \"47086f2a-3e89-4170-9f19-5bfd1d07c1ff\") " pod="openstack-operators/7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj" Sep 29 10:01:28 crc kubenswrapper[4891]: I0929 10:01:28.289045 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lfmj\" (UniqueName: \"kubernetes.io/projected/47086f2a-3e89-4170-9f19-5bfd1d07c1ff-kube-api-access-2lfmj\") pod \"7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj\" (UID: \"47086f2a-3e89-4170-9f19-5bfd1d07c1ff\") " pod="openstack-operators/7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj" Sep 29 10:01:28 crc kubenswrapper[4891]: I0929 10:01:28.289525 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47086f2a-3e89-4170-9f19-5bfd1d07c1ff-util\") pod \"7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj\" (UID: 
\"47086f2a-3e89-4170-9f19-5bfd1d07c1ff\") " pod="openstack-operators/7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj" Sep 29 10:01:28 crc kubenswrapper[4891]: I0929 10:01:28.289562 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47086f2a-3e89-4170-9f19-5bfd1d07c1ff-bundle\") pod \"7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj\" (UID: \"47086f2a-3e89-4170-9f19-5bfd1d07c1ff\") " pod="openstack-operators/7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj" Sep 29 10:01:28 crc kubenswrapper[4891]: I0929 10:01:28.329870 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lfmj\" (UniqueName: \"kubernetes.io/projected/47086f2a-3e89-4170-9f19-5bfd1d07c1ff-kube-api-access-2lfmj\") pod \"7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj\" (UID: \"47086f2a-3e89-4170-9f19-5bfd1d07c1ff\") " pod="openstack-operators/7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj" Sep 29 10:01:28 crc kubenswrapper[4891]: I0929 10:01:28.604242 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj" Sep 29 10:01:29 crc kubenswrapper[4891]: I0929 10:01:29.162475 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj"] Sep 29 10:01:29 crc kubenswrapper[4891]: W0929 10:01:29.173517 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47086f2a_3e89_4170_9f19_5bfd1d07c1ff.slice/crio-c9c488f11a73b51cc317d6a55515cd2ee4683af75ff1d8366bd20190ac1f3b88 WatchSource:0}: Error finding container c9c488f11a73b51cc317d6a55515cd2ee4683af75ff1d8366bd20190ac1f3b88: Status 404 returned error can't find the container with id c9c488f11a73b51cc317d6a55515cd2ee4683af75ff1d8366bd20190ac1f3b88 Sep 29 10:01:29 crc kubenswrapper[4891]: I0929 10:01:29.698442 4891 generic.go:334] "Generic (PLEG): container finished" podID="47086f2a-3e89-4170-9f19-5bfd1d07c1ff" containerID="9636f270809532d8b9732f8a1ed800ba009e506127253b584488184cd58ae168" exitCode=0 Sep 29 10:01:29 crc kubenswrapper[4891]: I0929 10:01:29.698552 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj" event={"ID":"47086f2a-3e89-4170-9f19-5bfd1d07c1ff","Type":"ContainerDied","Data":"9636f270809532d8b9732f8a1ed800ba009e506127253b584488184cd58ae168"} Sep 29 10:01:29 crc kubenswrapper[4891]: I0929 10:01:29.699032 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj" event={"ID":"47086f2a-3e89-4170-9f19-5bfd1d07c1ff","Type":"ContainerStarted","Data":"c9c488f11a73b51cc317d6a55515cd2ee4683af75ff1d8366bd20190ac1f3b88"} Sep 29 10:01:30 crc kubenswrapper[4891]: I0929 10:01:30.710597 4891 generic.go:334] "Generic (PLEG): container finished" 
podID="47086f2a-3e89-4170-9f19-5bfd1d07c1ff" containerID="4b9b6a790857a6933bdfe7b9f754a24fbf04b070e43d3536751304e8a890d8d1" exitCode=0 Sep 29 10:01:30 crc kubenswrapper[4891]: I0929 10:01:30.711011 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj" event={"ID":"47086f2a-3e89-4170-9f19-5bfd1d07c1ff","Type":"ContainerDied","Data":"4b9b6a790857a6933bdfe7b9f754a24fbf04b070e43d3536751304e8a890d8d1"} Sep 29 10:01:31 crc kubenswrapper[4891]: I0929 10:01:31.722621 4891 generic.go:334] "Generic (PLEG): container finished" podID="47086f2a-3e89-4170-9f19-5bfd1d07c1ff" containerID="53bf275d70228c490851ea769b5f791c91bb2a8e5f56a982592694cad83d433e" exitCode=0 Sep 29 10:01:31 crc kubenswrapper[4891]: I0929 10:01:31.722779 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj" event={"ID":"47086f2a-3e89-4170-9f19-5bfd1d07c1ff","Type":"ContainerDied","Data":"53bf275d70228c490851ea769b5f791c91bb2a8e5f56a982592694cad83d433e"} Sep 29 10:01:33 crc kubenswrapper[4891]: I0929 10:01:33.115464 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj" Sep 29 10:01:33 crc kubenswrapper[4891]: I0929 10:01:33.283376 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lfmj\" (UniqueName: \"kubernetes.io/projected/47086f2a-3e89-4170-9f19-5bfd1d07c1ff-kube-api-access-2lfmj\") pod \"47086f2a-3e89-4170-9f19-5bfd1d07c1ff\" (UID: \"47086f2a-3e89-4170-9f19-5bfd1d07c1ff\") " Sep 29 10:01:33 crc kubenswrapper[4891]: I0929 10:01:33.283550 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47086f2a-3e89-4170-9f19-5bfd1d07c1ff-util\") pod \"47086f2a-3e89-4170-9f19-5bfd1d07c1ff\" (UID: \"47086f2a-3e89-4170-9f19-5bfd1d07c1ff\") " Sep 29 10:01:33 crc kubenswrapper[4891]: I0929 10:01:33.283617 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47086f2a-3e89-4170-9f19-5bfd1d07c1ff-bundle\") pod \"47086f2a-3e89-4170-9f19-5bfd1d07c1ff\" (UID: \"47086f2a-3e89-4170-9f19-5bfd1d07c1ff\") " Sep 29 10:01:33 crc kubenswrapper[4891]: I0929 10:01:33.285271 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47086f2a-3e89-4170-9f19-5bfd1d07c1ff-bundle" (OuterVolumeSpecName: "bundle") pod "47086f2a-3e89-4170-9f19-5bfd1d07c1ff" (UID: "47086f2a-3e89-4170-9f19-5bfd1d07c1ff"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:01:33 crc kubenswrapper[4891]: I0929 10:01:33.297102 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47086f2a-3e89-4170-9f19-5bfd1d07c1ff-kube-api-access-2lfmj" (OuterVolumeSpecName: "kube-api-access-2lfmj") pod "47086f2a-3e89-4170-9f19-5bfd1d07c1ff" (UID: "47086f2a-3e89-4170-9f19-5bfd1d07c1ff"). InnerVolumeSpecName "kube-api-access-2lfmj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:01:33 crc kubenswrapper[4891]: I0929 10:01:33.304406 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47086f2a-3e89-4170-9f19-5bfd1d07c1ff-util" (OuterVolumeSpecName: "util") pod "47086f2a-3e89-4170-9f19-5bfd1d07c1ff" (UID: "47086f2a-3e89-4170-9f19-5bfd1d07c1ff"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:01:33 crc kubenswrapper[4891]: I0929 10:01:33.385577 4891 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47086f2a-3e89-4170-9f19-5bfd1d07c1ff-util\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:33 crc kubenswrapper[4891]: I0929 10:01:33.385611 4891 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47086f2a-3e89-4170-9f19-5bfd1d07c1ff-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:33 crc kubenswrapper[4891]: I0929 10:01:33.385622 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lfmj\" (UniqueName: \"kubernetes.io/projected/47086f2a-3e89-4170-9f19-5bfd1d07c1ff-kube-api-access-2lfmj\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:33 crc kubenswrapper[4891]: I0929 10:01:33.743831 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj" event={"ID":"47086f2a-3e89-4170-9f19-5bfd1d07c1ff","Type":"ContainerDied","Data":"c9c488f11a73b51cc317d6a55515cd2ee4683af75ff1d8366bd20190ac1f3b88"} Sep 29 10:01:33 crc kubenswrapper[4891]: I0929 10:01:33.744606 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9c488f11a73b51cc317d6a55515cd2ee4683af75ff1d8366bd20190ac1f3b88" Sep 29 10:01:33 crc kubenswrapper[4891]: I0929 10:01:33.744167 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj" Sep 29 10:01:36 crc kubenswrapper[4891]: I0929 10:01:36.186550 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:01:36 crc kubenswrapper[4891]: I0929 10:01:36.188643 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:01:36 crc kubenswrapper[4891]: I0929 10:01:36.189151 4891 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 10:01:36 crc kubenswrapper[4891]: I0929 10:01:36.190360 4891 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c84275b86dc57120809c449b950b94bd4d98c74f39bd83b524ef83f4962bef20"} pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 10:01:36 crc kubenswrapper[4891]: I0929 10:01:36.190530 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" containerID="cri-o://c84275b86dc57120809c449b950b94bd4d98c74f39bd83b524ef83f4962bef20" gracePeriod=600 Sep 29 10:01:36 crc kubenswrapper[4891]: I0929 10:01:36.775801 4891 generic.go:334] "Generic (PLEG): 
container finished" podID="582de198-5a15-4c4c-aaea-881c638a42ac" containerID="c84275b86dc57120809c449b950b94bd4d98c74f39bd83b524ef83f4962bef20" exitCode=0 Sep 29 10:01:36 crc kubenswrapper[4891]: I0929 10:01:36.775852 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerDied","Data":"c84275b86dc57120809c449b950b94bd4d98c74f39bd83b524ef83f4962bef20"} Sep 29 10:01:36 crc kubenswrapper[4891]: I0929 10:01:36.776129 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerStarted","Data":"65905951ff597e2e9f5a100530eb12ae214ca6c149882ca87ecb04c813df66e6"} Sep 29 10:01:36 crc kubenswrapper[4891]: I0929 10:01:36.776163 4891 scope.go:117] "RemoveContainer" containerID="8378d58b094c2cac919b4d4b3b96c7247b1168ddd946002225b707d6b5dec558" Sep 29 10:01:40 crc kubenswrapper[4891]: I0929 10:01:40.473001 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-844b5d775b-wwwqn"] Sep 29 10:01:40 crc kubenswrapper[4891]: E0929 10:01:40.474073 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47086f2a-3e89-4170-9f19-5bfd1d07c1ff" containerName="pull" Sep 29 10:01:40 crc kubenswrapper[4891]: I0929 10:01:40.474089 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="47086f2a-3e89-4170-9f19-5bfd1d07c1ff" containerName="pull" Sep 29 10:01:40 crc kubenswrapper[4891]: E0929 10:01:40.474122 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47086f2a-3e89-4170-9f19-5bfd1d07c1ff" containerName="util" Sep 29 10:01:40 crc kubenswrapper[4891]: I0929 10:01:40.474129 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="47086f2a-3e89-4170-9f19-5bfd1d07c1ff" containerName="util" Sep 29 10:01:40 crc kubenswrapper[4891]: 
E0929 10:01:40.474138 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47086f2a-3e89-4170-9f19-5bfd1d07c1ff" containerName="extract" Sep 29 10:01:40 crc kubenswrapper[4891]: I0929 10:01:40.474144 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="47086f2a-3e89-4170-9f19-5bfd1d07c1ff" containerName="extract" Sep 29 10:01:40 crc kubenswrapper[4891]: I0929 10:01:40.474263 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="47086f2a-3e89-4170-9f19-5bfd1d07c1ff" containerName="extract" Sep 29 10:01:40 crc kubenswrapper[4891]: I0929 10:01:40.475033 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-844b5d775b-wwwqn" Sep 29 10:01:40 crc kubenswrapper[4891]: I0929 10:01:40.478008 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-mmhb4" Sep 29 10:01:40 crc kubenswrapper[4891]: I0929 10:01:40.513357 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-844b5d775b-wwwqn"] Sep 29 10:01:40 crc kubenswrapper[4891]: I0929 10:01:40.526034 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2zvx\" (UniqueName: \"kubernetes.io/projected/01b68a39-0d11-4f7f-b7c8-b2e50dae7e2d-kube-api-access-t2zvx\") pod \"openstack-operator-controller-operator-844b5d775b-wwwqn\" (UID: \"01b68a39-0d11-4f7f-b7c8-b2e50dae7e2d\") " pod="openstack-operators/openstack-operator-controller-operator-844b5d775b-wwwqn" Sep 29 10:01:40 crc kubenswrapper[4891]: I0929 10:01:40.627760 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2zvx\" (UniqueName: \"kubernetes.io/projected/01b68a39-0d11-4f7f-b7c8-b2e50dae7e2d-kube-api-access-t2zvx\") pod \"openstack-operator-controller-operator-844b5d775b-wwwqn\" (UID: 
\"01b68a39-0d11-4f7f-b7c8-b2e50dae7e2d\") " pod="openstack-operators/openstack-operator-controller-operator-844b5d775b-wwwqn" Sep 29 10:01:40 crc kubenswrapper[4891]: I0929 10:01:40.657612 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2zvx\" (UniqueName: \"kubernetes.io/projected/01b68a39-0d11-4f7f-b7c8-b2e50dae7e2d-kube-api-access-t2zvx\") pod \"openstack-operator-controller-operator-844b5d775b-wwwqn\" (UID: \"01b68a39-0d11-4f7f-b7c8-b2e50dae7e2d\") " pod="openstack-operators/openstack-operator-controller-operator-844b5d775b-wwwqn" Sep 29 10:01:40 crc kubenswrapper[4891]: I0929 10:01:40.800162 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-844b5d775b-wwwqn" Sep 29 10:01:41 crc kubenswrapper[4891]: I0929 10:01:41.120346 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-844b5d775b-wwwqn"] Sep 29 10:01:41 crc kubenswrapper[4891]: I0929 10:01:41.840480 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-844b5d775b-wwwqn" event={"ID":"01b68a39-0d11-4f7f-b7c8-b2e50dae7e2d","Type":"ContainerStarted","Data":"fcc49b418ff76ac5b8ac22c70c2ac241927f442fef279d396183ad767e5a8e7b"} Sep 29 10:01:44 crc kubenswrapper[4891]: I0929 10:01:44.666614 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-td55b"] Sep 29 10:01:44 crc kubenswrapper[4891]: I0929 10:01:44.669180 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-td55b" Sep 29 10:01:44 crc kubenswrapper[4891]: I0929 10:01:44.686356 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-td55b"] Sep 29 10:01:44 crc kubenswrapper[4891]: I0929 10:01:44.712315 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0618d3-e135-40e4-9014-7f5f60bb5a51-utilities\") pod \"community-operators-td55b\" (UID: \"ea0618d3-e135-40e4-9014-7f5f60bb5a51\") " pod="openshift-marketplace/community-operators-td55b" Sep 29 10:01:44 crc kubenswrapper[4891]: I0929 10:01:44.712412 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxx8w\" (UniqueName: \"kubernetes.io/projected/ea0618d3-e135-40e4-9014-7f5f60bb5a51-kube-api-access-kxx8w\") pod \"community-operators-td55b\" (UID: \"ea0618d3-e135-40e4-9014-7f5f60bb5a51\") " pod="openshift-marketplace/community-operators-td55b" Sep 29 10:01:44 crc kubenswrapper[4891]: I0929 10:01:44.712496 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0618d3-e135-40e4-9014-7f5f60bb5a51-catalog-content\") pod \"community-operators-td55b\" (UID: \"ea0618d3-e135-40e4-9014-7f5f60bb5a51\") " pod="openshift-marketplace/community-operators-td55b" Sep 29 10:01:44 crc kubenswrapper[4891]: I0929 10:01:44.813661 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0618d3-e135-40e4-9014-7f5f60bb5a51-catalog-content\") pod \"community-operators-td55b\" (UID: \"ea0618d3-e135-40e4-9014-7f5f60bb5a51\") " pod="openshift-marketplace/community-operators-td55b" Sep 29 10:01:44 crc kubenswrapper[4891]: I0929 10:01:44.813722 4891 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0618d3-e135-40e4-9014-7f5f60bb5a51-utilities\") pod \"community-operators-td55b\" (UID: \"ea0618d3-e135-40e4-9014-7f5f60bb5a51\") " pod="openshift-marketplace/community-operators-td55b" Sep 29 10:01:44 crc kubenswrapper[4891]: I0929 10:01:44.813770 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxx8w\" (UniqueName: \"kubernetes.io/projected/ea0618d3-e135-40e4-9014-7f5f60bb5a51-kube-api-access-kxx8w\") pod \"community-operators-td55b\" (UID: \"ea0618d3-e135-40e4-9014-7f5f60bb5a51\") " pod="openshift-marketplace/community-operators-td55b" Sep 29 10:01:44 crc kubenswrapper[4891]: I0929 10:01:44.814753 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0618d3-e135-40e4-9014-7f5f60bb5a51-catalog-content\") pod \"community-operators-td55b\" (UID: \"ea0618d3-e135-40e4-9014-7f5f60bb5a51\") " pod="openshift-marketplace/community-operators-td55b" Sep 29 10:01:44 crc kubenswrapper[4891]: I0929 10:01:44.814853 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0618d3-e135-40e4-9014-7f5f60bb5a51-utilities\") pod \"community-operators-td55b\" (UID: \"ea0618d3-e135-40e4-9014-7f5f60bb5a51\") " pod="openshift-marketplace/community-operators-td55b" Sep 29 10:01:44 crc kubenswrapper[4891]: I0929 10:01:44.836882 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxx8w\" (UniqueName: \"kubernetes.io/projected/ea0618d3-e135-40e4-9014-7f5f60bb5a51-kube-api-access-kxx8w\") pod \"community-operators-td55b\" (UID: \"ea0618d3-e135-40e4-9014-7f5f60bb5a51\") " pod="openshift-marketplace/community-operators-td55b" Sep 29 10:01:45 crc kubenswrapper[4891]: I0929 10:01:45.005261 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-td55b" Sep 29 10:01:45 crc kubenswrapper[4891]: I0929 10:01:45.875494 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-844b5d775b-wwwqn" event={"ID":"01b68a39-0d11-4f7f-b7c8-b2e50dae7e2d","Type":"ContainerStarted","Data":"a0c7e05f4615510f30f05a1e68b2e29db9766de020dbae8113060828899baa96"} Sep 29 10:01:45 crc kubenswrapper[4891]: I0929 10:01:45.945361 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-td55b"] Sep 29 10:01:46 crc kubenswrapper[4891]: I0929 10:01:46.900896 4891 generic.go:334] "Generic (PLEG): container finished" podID="ea0618d3-e135-40e4-9014-7f5f60bb5a51" containerID="a9279aba7b4e99faa4aa8cfc4140d906b60466e24a69f1614b734db686f5aa78" exitCode=0 Sep 29 10:01:46 crc kubenswrapper[4891]: I0929 10:01:46.901165 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-td55b" event={"ID":"ea0618d3-e135-40e4-9014-7f5f60bb5a51","Type":"ContainerDied","Data":"a9279aba7b4e99faa4aa8cfc4140d906b60466e24a69f1614b734db686f5aa78"} Sep 29 10:01:46 crc kubenswrapper[4891]: I0929 10:01:46.901283 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-td55b" event={"ID":"ea0618d3-e135-40e4-9014-7f5f60bb5a51","Type":"ContainerStarted","Data":"1381eda631bef4e026b67ba82934f361c10fd08d4770bcc6f37661d724593b6a"} Sep 29 10:01:48 crc kubenswrapper[4891]: I0929 10:01:48.919386 4891 generic.go:334] "Generic (PLEG): container finished" podID="ea0618d3-e135-40e4-9014-7f5f60bb5a51" containerID="eed0b5a2720535e603d77f1bf44b24e04d5ecb7fa05fdde06e126e51ed2fbbc6" exitCode=0 Sep 29 10:01:48 crc kubenswrapper[4891]: I0929 10:01:48.919453 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-td55b" 
event={"ID":"ea0618d3-e135-40e4-9014-7f5f60bb5a51","Type":"ContainerDied","Data":"eed0b5a2720535e603d77f1bf44b24e04d5ecb7fa05fdde06e126e51ed2fbbc6"} Sep 29 10:01:48 crc kubenswrapper[4891]: I0929 10:01:48.922944 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-844b5d775b-wwwqn" event={"ID":"01b68a39-0d11-4f7f-b7c8-b2e50dae7e2d","Type":"ContainerStarted","Data":"271d1ff231a5cbe073fb6625fbb62af6224da9c07d2e1d43537eb3e26753da12"} Sep 29 10:01:48 crc kubenswrapper[4891]: I0929 10:01:48.923173 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-844b5d775b-wwwqn" Sep 29 10:01:49 crc kubenswrapper[4891]: I0929 10:01:49.934211 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-td55b" event={"ID":"ea0618d3-e135-40e4-9014-7f5f60bb5a51","Type":"ContainerStarted","Data":"778892b041ed467828d9394cabb95f17e042606fb3aba288b34f04c9f518f83b"} Sep 29 10:01:49 crc kubenswrapper[4891]: I0929 10:01:49.956870 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-844b5d775b-wwwqn" podStartSLOduration=3.120591052 podStartE2EDuration="9.95683631s" podCreationTimestamp="2025-09-29 10:01:40 +0000 UTC" firstStartedPulling="2025-09-29 10:01:41.121142568 +0000 UTC m=+831.326310889" lastFinishedPulling="2025-09-29 10:01:47.957387826 +0000 UTC m=+838.162556147" observedRunningTime="2025-09-29 10:01:49.007265731 +0000 UTC m=+839.212434112" watchObservedRunningTime="2025-09-29 10:01:49.95683631 +0000 UTC m=+840.162004691" Sep 29 10:01:49 crc kubenswrapper[4891]: I0929 10:01:49.957608 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-td55b" podStartSLOduration=3.5449625129999998 podStartE2EDuration="5.957601973s" podCreationTimestamp="2025-09-29 
10:01:44 +0000 UTC" firstStartedPulling="2025-09-29 10:01:46.964594506 +0000 UTC m=+837.169762827" lastFinishedPulling="2025-09-29 10:01:49.377233956 +0000 UTC m=+839.582402287" observedRunningTime="2025-09-29 10:01:49.955038786 +0000 UTC m=+840.160207117" watchObservedRunningTime="2025-09-29 10:01:49.957601973 +0000 UTC m=+840.162770334" Sep 29 10:01:50 crc kubenswrapper[4891]: I0929 10:01:50.803969 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-844b5d775b-wwwqn" Sep 29 10:01:55 crc kubenswrapper[4891]: I0929 10:01:55.005814 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-td55b" Sep 29 10:01:55 crc kubenswrapper[4891]: I0929 10:01:55.006201 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-td55b" Sep 29 10:01:55 crc kubenswrapper[4891]: I0929 10:01:55.093749 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-td55b" Sep 29 10:01:56 crc kubenswrapper[4891]: I0929 10:01:56.069116 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-td55b" Sep 29 10:01:57 crc kubenswrapper[4891]: I0929 10:01:57.447707 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-td55b"] Sep 29 10:01:58 crc kubenswrapper[4891]: I0929 10:01:58.003770 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-td55b" podUID="ea0618d3-e135-40e4-9014-7f5f60bb5a51" containerName="registry-server" containerID="cri-o://778892b041ed467828d9394cabb95f17e042606fb3aba288b34f04c9f518f83b" gracePeriod=2 Sep 29 10:01:58 crc kubenswrapper[4891]: I0929 10:01:58.419784 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-td55b" Sep 29 10:01:58 crc kubenswrapper[4891]: I0929 10:01:58.588902 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0618d3-e135-40e4-9014-7f5f60bb5a51-catalog-content\") pod \"ea0618d3-e135-40e4-9014-7f5f60bb5a51\" (UID: \"ea0618d3-e135-40e4-9014-7f5f60bb5a51\") " Sep 29 10:01:58 crc kubenswrapper[4891]: I0929 10:01:58.589137 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxx8w\" (UniqueName: \"kubernetes.io/projected/ea0618d3-e135-40e4-9014-7f5f60bb5a51-kube-api-access-kxx8w\") pod \"ea0618d3-e135-40e4-9014-7f5f60bb5a51\" (UID: \"ea0618d3-e135-40e4-9014-7f5f60bb5a51\") " Sep 29 10:01:58 crc kubenswrapper[4891]: I0929 10:01:58.589348 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0618d3-e135-40e4-9014-7f5f60bb5a51-utilities\") pod \"ea0618d3-e135-40e4-9014-7f5f60bb5a51\" (UID: \"ea0618d3-e135-40e4-9014-7f5f60bb5a51\") " Sep 29 10:01:58 crc kubenswrapper[4891]: I0929 10:01:58.590544 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea0618d3-e135-40e4-9014-7f5f60bb5a51-utilities" (OuterVolumeSpecName: "utilities") pod "ea0618d3-e135-40e4-9014-7f5f60bb5a51" (UID: "ea0618d3-e135-40e4-9014-7f5f60bb5a51"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:01:58 crc kubenswrapper[4891]: I0929 10:01:58.599754 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea0618d3-e135-40e4-9014-7f5f60bb5a51-kube-api-access-kxx8w" (OuterVolumeSpecName: "kube-api-access-kxx8w") pod "ea0618d3-e135-40e4-9014-7f5f60bb5a51" (UID: "ea0618d3-e135-40e4-9014-7f5f60bb5a51"). InnerVolumeSpecName "kube-api-access-kxx8w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:01:58 crc kubenswrapper[4891]: I0929 10:01:58.672424 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea0618d3-e135-40e4-9014-7f5f60bb5a51-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea0618d3-e135-40e4-9014-7f5f60bb5a51" (UID: "ea0618d3-e135-40e4-9014-7f5f60bb5a51"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:01:58 crc kubenswrapper[4891]: I0929 10:01:58.691652 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0618d3-e135-40e4-9014-7f5f60bb5a51-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:58 crc kubenswrapper[4891]: I0929 10:01:58.691694 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxx8w\" (UniqueName: \"kubernetes.io/projected/ea0618d3-e135-40e4-9014-7f5f60bb5a51-kube-api-access-kxx8w\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:58 crc kubenswrapper[4891]: I0929 10:01:58.691710 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0618d3-e135-40e4-9014-7f5f60bb5a51-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:59 crc kubenswrapper[4891]: I0929 10:01:59.014021 4891 generic.go:334] "Generic (PLEG): container finished" podID="ea0618d3-e135-40e4-9014-7f5f60bb5a51" containerID="778892b041ed467828d9394cabb95f17e042606fb3aba288b34f04c9f518f83b" exitCode=0 Sep 29 10:01:59 crc kubenswrapper[4891]: I0929 10:01:59.014111 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-td55b" event={"ID":"ea0618d3-e135-40e4-9014-7f5f60bb5a51","Type":"ContainerDied","Data":"778892b041ed467828d9394cabb95f17e042606fb3aba288b34f04c9f518f83b"} Sep 29 10:01:59 crc kubenswrapper[4891]: I0929 10:01:59.014128 4891 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-td55b" Sep 29 10:01:59 crc kubenswrapper[4891]: I0929 10:01:59.014165 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-td55b" event={"ID":"ea0618d3-e135-40e4-9014-7f5f60bb5a51","Type":"ContainerDied","Data":"1381eda631bef4e026b67ba82934f361c10fd08d4770bcc6f37661d724593b6a"} Sep 29 10:01:59 crc kubenswrapper[4891]: I0929 10:01:59.014196 4891 scope.go:117] "RemoveContainer" containerID="778892b041ed467828d9394cabb95f17e042606fb3aba288b34f04c9f518f83b" Sep 29 10:01:59 crc kubenswrapper[4891]: I0929 10:01:59.039913 4891 scope.go:117] "RemoveContainer" containerID="eed0b5a2720535e603d77f1bf44b24e04d5ecb7fa05fdde06e126e51ed2fbbc6" Sep 29 10:01:59 crc kubenswrapper[4891]: I0929 10:01:59.055346 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-td55b"] Sep 29 10:01:59 crc kubenswrapper[4891]: I0929 10:01:59.062822 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-td55b"] Sep 29 10:01:59 crc kubenswrapper[4891]: I0929 10:01:59.069725 4891 scope.go:117] "RemoveContainer" containerID="a9279aba7b4e99faa4aa8cfc4140d906b60466e24a69f1614b734db686f5aa78" Sep 29 10:01:59 crc kubenswrapper[4891]: I0929 10:01:59.087369 4891 scope.go:117] "RemoveContainer" containerID="778892b041ed467828d9394cabb95f17e042606fb3aba288b34f04c9f518f83b" Sep 29 10:01:59 crc kubenswrapper[4891]: E0929 10:01:59.087952 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"778892b041ed467828d9394cabb95f17e042606fb3aba288b34f04c9f518f83b\": container with ID starting with 778892b041ed467828d9394cabb95f17e042606fb3aba288b34f04c9f518f83b not found: ID does not exist" containerID="778892b041ed467828d9394cabb95f17e042606fb3aba288b34f04c9f518f83b" Sep 29 10:01:59 crc kubenswrapper[4891]: I0929 10:01:59.087993 
4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"778892b041ed467828d9394cabb95f17e042606fb3aba288b34f04c9f518f83b"} err="failed to get container status \"778892b041ed467828d9394cabb95f17e042606fb3aba288b34f04c9f518f83b\": rpc error: code = NotFound desc = could not find container \"778892b041ed467828d9394cabb95f17e042606fb3aba288b34f04c9f518f83b\": container with ID starting with 778892b041ed467828d9394cabb95f17e042606fb3aba288b34f04c9f518f83b not found: ID does not exist" Sep 29 10:01:59 crc kubenswrapper[4891]: I0929 10:01:59.088023 4891 scope.go:117] "RemoveContainer" containerID="eed0b5a2720535e603d77f1bf44b24e04d5ecb7fa05fdde06e126e51ed2fbbc6" Sep 29 10:01:59 crc kubenswrapper[4891]: E0929 10:01:59.088411 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eed0b5a2720535e603d77f1bf44b24e04d5ecb7fa05fdde06e126e51ed2fbbc6\": container with ID starting with eed0b5a2720535e603d77f1bf44b24e04d5ecb7fa05fdde06e126e51ed2fbbc6 not found: ID does not exist" containerID="eed0b5a2720535e603d77f1bf44b24e04d5ecb7fa05fdde06e126e51ed2fbbc6" Sep 29 10:01:59 crc kubenswrapper[4891]: I0929 10:01:59.088435 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eed0b5a2720535e603d77f1bf44b24e04d5ecb7fa05fdde06e126e51ed2fbbc6"} err="failed to get container status \"eed0b5a2720535e603d77f1bf44b24e04d5ecb7fa05fdde06e126e51ed2fbbc6\": rpc error: code = NotFound desc = could not find container \"eed0b5a2720535e603d77f1bf44b24e04d5ecb7fa05fdde06e126e51ed2fbbc6\": container with ID starting with eed0b5a2720535e603d77f1bf44b24e04d5ecb7fa05fdde06e126e51ed2fbbc6 not found: ID does not exist" Sep 29 10:01:59 crc kubenswrapper[4891]: I0929 10:01:59.088449 4891 scope.go:117] "RemoveContainer" containerID="a9279aba7b4e99faa4aa8cfc4140d906b60466e24a69f1614b734db686f5aa78" Sep 29 10:01:59 crc kubenswrapper[4891]: E0929 
10:01:59.088674 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9279aba7b4e99faa4aa8cfc4140d906b60466e24a69f1614b734db686f5aa78\": container with ID starting with a9279aba7b4e99faa4aa8cfc4140d906b60466e24a69f1614b734db686f5aa78 not found: ID does not exist" containerID="a9279aba7b4e99faa4aa8cfc4140d906b60466e24a69f1614b734db686f5aa78" Sep 29 10:01:59 crc kubenswrapper[4891]: I0929 10:01:59.088698 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9279aba7b4e99faa4aa8cfc4140d906b60466e24a69f1614b734db686f5aa78"} err="failed to get container status \"a9279aba7b4e99faa4aa8cfc4140d906b60466e24a69f1614b734db686f5aa78\": rpc error: code = NotFound desc = could not find container \"a9279aba7b4e99faa4aa8cfc4140d906b60466e24a69f1614b734db686f5aa78\": container with ID starting with a9279aba7b4e99faa4aa8cfc4140d906b60466e24a69f1614b734db686f5aa78 not found: ID does not exist" Sep 29 10:02:00 crc kubenswrapper[4891]: I0929 10:02:00.424777 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea0618d3-e135-40e4-9014-7f5f60bb5a51" path="/var/lib/kubelet/pods/ea0618d3-e135-40e4-9014-7f5f60bb5a51/volumes" Sep 29 10:02:01 crc kubenswrapper[4891]: I0929 10:02:01.851896 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lsvl4"] Sep 29 10:02:01 crc kubenswrapper[4891]: E0929 10:02:01.852924 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea0618d3-e135-40e4-9014-7f5f60bb5a51" containerName="extract-utilities" Sep 29 10:02:01 crc kubenswrapper[4891]: I0929 10:02:01.852948 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0618d3-e135-40e4-9014-7f5f60bb5a51" containerName="extract-utilities" Sep 29 10:02:01 crc kubenswrapper[4891]: E0929 10:02:01.852968 4891 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ea0618d3-e135-40e4-9014-7f5f60bb5a51" containerName="registry-server" Sep 29 10:02:01 crc kubenswrapper[4891]: I0929 10:02:01.852981 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0618d3-e135-40e4-9014-7f5f60bb5a51" containerName="registry-server" Sep 29 10:02:01 crc kubenswrapper[4891]: E0929 10:02:01.853012 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea0618d3-e135-40e4-9014-7f5f60bb5a51" containerName="extract-content" Sep 29 10:02:01 crc kubenswrapper[4891]: I0929 10:02:01.853026 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0618d3-e135-40e4-9014-7f5f60bb5a51" containerName="extract-content" Sep 29 10:02:01 crc kubenswrapper[4891]: I0929 10:02:01.853244 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea0618d3-e135-40e4-9014-7f5f60bb5a51" containerName="registry-server" Sep 29 10:02:01 crc kubenswrapper[4891]: I0929 10:02:01.855043 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lsvl4" Sep 29 10:02:01 crc kubenswrapper[4891]: I0929 10:02:01.880967 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lsvl4"] Sep 29 10:02:02 crc kubenswrapper[4891]: I0929 10:02:02.062902 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9eb7270e-4994-471c-bb92-356bf4d3708b-utilities\") pod \"redhat-marketplace-lsvl4\" (UID: \"9eb7270e-4994-471c-bb92-356bf4d3708b\") " pod="openshift-marketplace/redhat-marketplace-lsvl4" Sep 29 10:02:02 crc kubenswrapper[4891]: I0929 10:02:02.063020 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9eb7270e-4994-471c-bb92-356bf4d3708b-catalog-content\") pod \"redhat-marketplace-lsvl4\" (UID: \"9eb7270e-4994-471c-bb92-356bf4d3708b\") " 
pod="openshift-marketplace/redhat-marketplace-lsvl4" Sep 29 10:02:02 crc kubenswrapper[4891]: I0929 10:02:02.063421 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdtl4\" (UniqueName: \"kubernetes.io/projected/9eb7270e-4994-471c-bb92-356bf4d3708b-kube-api-access-xdtl4\") pod \"redhat-marketplace-lsvl4\" (UID: \"9eb7270e-4994-471c-bb92-356bf4d3708b\") " pod="openshift-marketplace/redhat-marketplace-lsvl4" Sep 29 10:02:02 crc kubenswrapper[4891]: I0929 10:02:02.164862 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdtl4\" (UniqueName: \"kubernetes.io/projected/9eb7270e-4994-471c-bb92-356bf4d3708b-kube-api-access-xdtl4\") pod \"redhat-marketplace-lsvl4\" (UID: \"9eb7270e-4994-471c-bb92-356bf4d3708b\") " pod="openshift-marketplace/redhat-marketplace-lsvl4" Sep 29 10:02:02 crc kubenswrapper[4891]: I0929 10:02:02.164949 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9eb7270e-4994-471c-bb92-356bf4d3708b-utilities\") pod \"redhat-marketplace-lsvl4\" (UID: \"9eb7270e-4994-471c-bb92-356bf4d3708b\") " pod="openshift-marketplace/redhat-marketplace-lsvl4" Sep 29 10:02:02 crc kubenswrapper[4891]: I0929 10:02:02.165006 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9eb7270e-4994-471c-bb92-356bf4d3708b-catalog-content\") pod \"redhat-marketplace-lsvl4\" (UID: \"9eb7270e-4994-471c-bb92-356bf4d3708b\") " pod="openshift-marketplace/redhat-marketplace-lsvl4" Sep 29 10:02:02 crc kubenswrapper[4891]: I0929 10:02:02.165767 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9eb7270e-4994-471c-bb92-356bf4d3708b-catalog-content\") pod \"redhat-marketplace-lsvl4\" (UID: \"9eb7270e-4994-471c-bb92-356bf4d3708b\") " 
pod="openshift-marketplace/redhat-marketplace-lsvl4" Sep 29 10:02:02 crc kubenswrapper[4891]: I0929 10:02:02.165898 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9eb7270e-4994-471c-bb92-356bf4d3708b-utilities\") pod \"redhat-marketplace-lsvl4\" (UID: \"9eb7270e-4994-471c-bb92-356bf4d3708b\") " pod="openshift-marketplace/redhat-marketplace-lsvl4" Sep 29 10:02:02 crc kubenswrapper[4891]: I0929 10:02:02.190340 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdtl4\" (UniqueName: \"kubernetes.io/projected/9eb7270e-4994-471c-bb92-356bf4d3708b-kube-api-access-xdtl4\") pod \"redhat-marketplace-lsvl4\" (UID: \"9eb7270e-4994-471c-bb92-356bf4d3708b\") " pod="openshift-marketplace/redhat-marketplace-lsvl4" Sep 29 10:02:02 crc kubenswrapper[4891]: I0929 10:02:02.483964 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lsvl4" Sep 29 10:02:02 crc kubenswrapper[4891]: I0929 10:02:02.977990 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lsvl4"] Sep 29 10:02:02 crc kubenswrapper[4891]: W0929 10:02:02.993813 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9eb7270e_4994_471c_bb92_356bf4d3708b.slice/crio-94568ef2c293855d12bd84f98673c4a58f6262511426b2ac2cd0015dbc599c51 WatchSource:0}: Error finding container 94568ef2c293855d12bd84f98673c4a58f6262511426b2ac2cd0015dbc599c51: Status 404 returned error can't find the container with id 94568ef2c293855d12bd84f98673c4a58f6262511426b2ac2cd0015dbc599c51 Sep 29 10:02:03 crc kubenswrapper[4891]: I0929 10:02:03.063681 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lsvl4" 
event={"ID":"9eb7270e-4994-471c-bb92-356bf4d3708b","Type":"ContainerStarted","Data":"94568ef2c293855d12bd84f98673c4a58f6262511426b2ac2cd0015dbc599c51"} Sep 29 10:02:04 crc kubenswrapper[4891]: I0929 10:02:04.080589 4891 generic.go:334] "Generic (PLEG): container finished" podID="9eb7270e-4994-471c-bb92-356bf4d3708b" containerID="ed448bc9eca3fa3d4309f4f664c1761e3f24b99958e3f40150e4f202aa6f38f8" exitCode=0 Sep 29 10:02:04 crc kubenswrapper[4891]: I0929 10:02:04.080751 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lsvl4" event={"ID":"9eb7270e-4994-471c-bb92-356bf4d3708b","Type":"ContainerDied","Data":"ed448bc9eca3fa3d4309f4f664c1761e3f24b99958e3f40150e4f202aa6f38f8"} Sep 29 10:02:10 crc kubenswrapper[4891]: I0929 10:02:10.138338 4891 generic.go:334] "Generic (PLEG): container finished" podID="9eb7270e-4994-471c-bb92-356bf4d3708b" containerID="08da01f35f49dd808ad69eb4b54ba2fb849128239d3e53dbff036c1374191876" exitCode=0 Sep 29 10:02:10 crc kubenswrapper[4891]: I0929 10:02:10.138460 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lsvl4" event={"ID":"9eb7270e-4994-471c-bb92-356bf4d3708b","Type":"ContainerDied","Data":"08da01f35f49dd808ad69eb4b54ba2fb849128239d3e53dbff036c1374191876"} Sep 29 10:02:11 crc kubenswrapper[4891]: I0929 10:02:11.150485 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lsvl4" event={"ID":"9eb7270e-4994-471c-bb92-356bf4d3708b","Type":"ContainerStarted","Data":"bb9891fe5e0fd29f211332d0a2e217e070d642a049098675881c5b598a769ff0"} Sep 29 10:02:11 crc kubenswrapper[4891]: I0929 10:02:11.179947 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lsvl4" podStartSLOduration=3.537738038 podStartE2EDuration="10.179920343s" podCreationTimestamp="2025-09-29 10:02:01 +0000 UTC" firstStartedPulling="2025-09-29 10:02:04.083785116 +0000 UTC 
m=+854.288953447" lastFinishedPulling="2025-09-29 10:02:10.725967431 +0000 UTC m=+860.931135752" observedRunningTime="2025-09-29 10:02:11.170589735 +0000 UTC m=+861.375758056" watchObservedRunningTime="2025-09-29 10:02:11.179920343 +0000 UTC m=+861.385088664" Sep 29 10:02:12 crc kubenswrapper[4891]: I0929 10:02:12.484885 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lsvl4" Sep 29 10:02:12 crc kubenswrapper[4891]: I0929 10:02:12.484980 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lsvl4" Sep 29 10:02:12 crc kubenswrapper[4891]: I0929 10:02:12.552615 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lsvl4" Sep 29 10:02:19 crc kubenswrapper[4891]: I0929 10:02:19.453036 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j55qd"] Sep 29 10:02:19 crc kubenswrapper[4891]: I0929 10:02:19.454907 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j55qd" Sep 29 10:02:19 crc kubenswrapper[4891]: I0929 10:02:19.465460 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j55qd"] Sep 29 10:02:19 crc kubenswrapper[4891]: I0929 10:02:19.570289 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75f5m\" (UniqueName: \"kubernetes.io/projected/f6488ebe-940b-4de5-8549-f0c903cae2de-kube-api-access-75f5m\") pod \"certified-operators-j55qd\" (UID: \"f6488ebe-940b-4de5-8549-f0c903cae2de\") " pod="openshift-marketplace/certified-operators-j55qd" Sep 29 10:02:19 crc kubenswrapper[4891]: I0929 10:02:19.570755 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6488ebe-940b-4de5-8549-f0c903cae2de-utilities\") pod \"certified-operators-j55qd\" (UID: \"f6488ebe-940b-4de5-8549-f0c903cae2de\") " pod="openshift-marketplace/certified-operators-j55qd" Sep 29 10:02:19 crc kubenswrapper[4891]: I0929 10:02:19.570821 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6488ebe-940b-4de5-8549-f0c903cae2de-catalog-content\") pod \"certified-operators-j55qd\" (UID: \"f6488ebe-940b-4de5-8549-f0c903cae2de\") " pod="openshift-marketplace/certified-operators-j55qd" Sep 29 10:02:19 crc kubenswrapper[4891]: I0929 10:02:19.672015 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75f5m\" (UniqueName: \"kubernetes.io/projected/f6488ebe-940b-4de5-8549-f0c903cae2de-kube-api-access-75f5m\") pod \"certified-operators-j55qd\" (UID: \"f6488ebe-940b-4de5-8549-f0c903cae2de\") " pod="openshift-marketplace/certified-operators-j55qd" Sep 29 10:02:19 crc kubenswrapper[4891]: I0929 10:02:19.672075 4891 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6488ebe-940b-4de5-8549-f0c903cae2de-utilities\") pod \"certified-operators-j55qd\" (UID: \"f6488ebe-940b-4de5-8549-f0c903cae2de\") " pod="openshift-marketplace/certified-operators-j55qd" Sep 29 10:02:19 crc kubenswrapper[4891]: I0929 10:02:19.672118 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6488ebe-940b-4de5-8549-f0c903cae2de-catalog-content\") pod \"certified-operators-j55qd\" (UID: \"f6488ebe-940b-4de5-8549-f0c903cae2de\") " pod="openshift-marketplace/certified-operators-j55qd" Sep 29 10:02:19 crc kubenswrapper[4891]: I0929 10:02:19.672536 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6488ebe-940b-4de5-8549-f0c903cae2de-catalog-content\") pod \"certified-operators-j55qd\" (UID: \"f6488ebe-940b-4de5-8549-f0c903cae2de\") " pod="openshift-marketplace/certified-operators-j55qd" Sep 29 10:02:19 crc kubenswrapper[4891]: I0929 10:02:19.672712 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6488ebe-940b-4de5-8549-f0c903cae2de-utilities\") pod \"certified-operators-j55qd\" (UID: \"f6488ebe-940b-4de5-8549-f0c903cae2de\") " pod="openshift-marketplace/certified-operators-j55qd" Sep 29 10:02:19 crc kubenswrapper[4891]: I0929 10:02:19.697228 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75f5m\" (UniqueName: \"kubernetes.io/projected/f6488ebe-940b-4de5-8549-f0c903cae2de-kube-api-access-75f5m\") pod \"certified-operators-j55qd\" (UID: \"f6488ebe-940b-4de5-8549-f0c903cae2de\") " pod="openshift-marketplace/certified-operators-j55qd" Sep 29 10:02:19 crc kubenswrapper[4891]: I0929 10:02:19.771495 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j55qd" Sep 29 10:02:20 crc kubenswrapper[4891]: I0929 10:02:20.318485 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j55qd"] Sep 29 10:02:21 crc kubenswrapper[4891]: I0929 10:02:21.223851 4891 generic.go:334] "Generic (PLEG): container finished" podID="f6488ebe-940b-4de5-8549-f0c903cae2de" containerID="72d6a750e14c9912fe2f6204665b52394f4b6075a101c172e0bb6db86ee18712" exitCode=0 Sep 29 10:02:21 crc kubenswrapper[4891]: I0929 10:02:21.223942 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j55qd" event={"ID":"f6488ebe-940b-4de5-8549-f0c903cae2de","Type":"ContainerDied","Data":"72d6a750e14c9912fe2f6204665b52394f4b6075a101c172e0bb6db86ee18712"} Sep 29 10:02:21 crc kubenswrapper[4891]: I0929 10:02:21.224250 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j55qd" event={"ID":"f6488ebe-940b-4de5-8549-f0c903cae2de","Type":"ContainerStarted","Data":"e1e7df93ac41b415d421756ec32b41ad41312a0a1282a263b09d1716e4228d77"} Sep 29 10:02:22 crc kubenswrapper[4891]: I0929 10:02:22.232152 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j55qd" event={"ID":"f6488ebe-940b-4de5-8549-f0c903cae2de","Type":"ContainerStarted","Data":"96f69b9a0e25905231129df2047a7c80ce2c02d38b6a0d67878fd5f86c3596ac"} Sep 29 10:02:22 crc kubenswrapper[4891]: I0929 10:02:22.544066 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lsvl4" Sep 29 10:02:23 crc kubenswrapper[4891]: I0929 10:02:23.240191 4891 generic.go:334] "Generic (PLEG): container finished" podID="f6488ebe-940b-4de5-8549-f0c903cae2de" containerID="96f69b9a0e25905231129df2047a7c80ce2c02d38b6a0d67878fd5f86c3596ac" exitCode=0 Sep 29 10:02:23 crc kubenswrapper[4891]: I0929 10:02:23.240282 4891 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j55qd" event={"ID":"f6488ebe-940b-4de5-8549-f0c903cae2de","Type":"ContainerDied","Data":"96f69b9a0e25905231129df2047a7c80ce2c02d38b6a0d67878fd5f86c3596ac"} Sep 29 10:02:24 crc kubenswrapper[4891]: I0929 10:02:24.249104 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j55qd" event={"ID":"f6488ebe-940b-4de5-8549-f0c903cae2de","Type":"ContainerStarted","Data":"676034d7568cef10854732592a3d9707d2905148558e707f565e3c2d43d412c7"} Sep 29 10:02:24 crc kubenswrapper[4891]: I0929 10:02:24.269320 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j55qd" podStartSLOduration=2.736137994 podStartE2EDuration="5.269283261s" podCreationTimestamp="2025-09-29 10:02:19 +0000 UTC" firstStartedPulling="2025-09-29 10:02:21.226686954 +0000 UTC m=+871.431855285" lastFinishedPulling="2025-09-29 10:02:23.759832231 +0000 UTC m=+873.965000552" observedRunningTime="2025-09-29 10:02:24.267063865 +0000 UTC m=+874.472232186" watchObservedRunningTime="2025-09-29 10:02:24.269283261 +0000 UTC m=+874.474451592" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.698072 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6495d75b5-w2lkh"] Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.699997 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-w2lkh" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.702089 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-m66pj" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.711678 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6495d75b5-w2lkh"] Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.718022 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-748c574d75-qmk9v"] Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.719643 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-748c574d75-qmk9v" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.723589 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-fmxlb" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.730195 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-748c574d75-qmk9v"] Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.748363 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-7d74f4d695-m5dl2"] Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.749997 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-m5dl2" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.753364 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-kpqrb" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.781465 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-7d74f4d695-m5dl2"] Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.781536 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-67b5d44b7f-r4w8j"] Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.782672 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-r4w8j" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.785528 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-8mlfl" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.800220 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvsxs\" (UniqueName: \"kubernetes.io/projected/a7ad802e-1b9c-4ab0-a7eb-82932b6f5090-kube-api-access-gvsxs\") pod \"cinder-operator-controller-manager-748c574d75-qmk9v\" (UID: \"a7ad802e-1b9c-4ab0-a7eb-82932b6f5090\") " pod="openstack-operators/cinder-operator-controller-manager-748c574d75-qmk9v" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.800491 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js8dm\" (UniqueName: \"kubernetes.io/projected/28d145a8-69b6-4cf0-be6b-8bfbd0d2df07-kube-api-access-js8dm\") pod \"barbican-operator-controller-manager-6495d75b5-w2lkh\" (UID: 
\"28d145a8-69b6-4cf0-be6b-8bfbd0d2df07\") " pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-w2lkh" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.807188 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-8ff95898-x2qwd"] Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.808833 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-8ff95898-x2qwd" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.811949 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-bxgdf" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.819922 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-695847bc78-lwsw4"] Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.821542 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-lwsw4" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.828217 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-67b5d44b7f-r4w8j"] Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.830771 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-mpx7b" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.862887 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-695847bc78-lwsw4"] Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.869457 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-8ff95898-x2qwd"] Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.897762 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-858cd69f49-zj5dm"] Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.900417 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-zj5dm" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.900651 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-858cd69f49-zj5dm"] Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.901955 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h56jp\" (UniqueName: \"kubernetes.io/projected/40dffb60-1139-4864-b251-0aa8c145b66e-kube-api-access-h56jp\") pod \"glance-operator-controller-manager-67b5d44b7f-r4w8j\" (UID: \"40dffb60-1139-4864-b251-0aa8c145b66e\") " pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-r4w8j" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.902016 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vznss\" (UniqueName: \"kubernetes.io/projected/a04ca278-c2e3-4b48-85f8-16972204c367-kube-api-access-vznss\") pod \"designate-operator-controller-manager-7d74f4d695-m5dl2\" (UID: \"a04ca278-c2e3-4b48-85f8-16972204c367\") " pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-m5dl2" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.902078 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvsxs\" (UniqueName: \"kubernetes.io/projected/a7ad802e-1b9c-4ab0-a7eb-82932b6f5090-kube-api-access-gvsxs\") pod \"cinder-operator-controller-manager-748c574d75-qmk9v\" (UID: \"a7ad802e-1b9c-4ab0-a7eb-82932b6f5090\") " pod="openstack-operators/cinder-operator-controller-manager-748c574d75-qmk9v" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.902101 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c48cb\" (UniqueName: 
\"kubernetes.io/projected/b1ce187f-22c9-47c7-9f8b-e8d4b6c2aa31-kube-api-access-c48cb\") pod \"heat-operator-controller-manager-8ff95898-x2qwd\" (UID: \"b1ce187f-22c9-47c7-9f8b-e8d4b6c2aa31\") " pod="openstack-operators/heat-operator-controller-manager-8ff95898-x2qwd" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.902126 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7wcl\" (UniqueName: \"kubernetes.io/projected/543e23f1-51b6-489d-91d8-b1550bb69680-kube-api-access-n7wcl\") pod \"horizon-operator-controller-manager-695847bc78-lwsw4\" (UID: \"543e23f1-51b6-489d-91d8-b1550bb69680\") " pod="openstack-operators/horizon-operator-controller-manager-695847bc78-lwsw4" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.902178 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js8dm\" (UniqueName: \"kubernetes.io/projected/28d145a8-69b6-4cf0-be6b-8bfbd0d2df07-kube-api-access-js8dm\") pod \"barbican-operator-controller-manager-6495d75b5-w2lkh\" (UID: \"28d145a8-69b6-4cf0-be6b-8bfbd0d2df07\") " pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-w2lkh" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.905994 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.906365 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-mtnpj" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.913544 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9fc8d5567-8xchz"] Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.915018 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-8xchz" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.918946 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-gz57m" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.933389 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvsxs\" (UniqueName: \"kubernetes.io/projected/a7ad802e-1b9c-4ab0-a7eb-82932b6f5090-kube-api-access-gvsxs\") pod \"cinder-operator-controller-manager-748c574d75-qmk9v\" (UID: \"a7ad802e-1b9c-4ab0-a7eb-82932b6f5090\") " pod="openstack-operators/cinder-operator-controller-manager-748c574d75-qmk9v" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.945672 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js8dm\" (UniqueName: \"kubernetes.io/projected/28d145a8-69b6-4cf0-be6b-8bfbd0d2df07-kube-api-access-js8dm\") pod \"barbican-operator-controller-manager-6495d75b5-w2lkh\" (UID: \"28d145a8-69b6-4cf0-be6b-8bfbd0d2df07\") " pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-w2lkh" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.946389 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9fc8d5567-8xchz"] Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.961196 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7bf498966c-2nk4k"] Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.962862 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-2nk4k" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.967420 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-qmbqh" Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.990165 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7bf498966c-2nk4k"] Sep 29 10:02:25 crc kubenswrapper[4891]: I0929 10:02:25.998499 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-56cf9c6b99-6jllh"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.000205 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-6jllh" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.013850 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-56cf9c6b99-6jllh"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.013900 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54d766c9f9-f7t8j"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.030325 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-f7t8j" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.031474 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mbcv\" (UniqueName: \"kubernetes.io/projected/bddd647a-c213-41dd-9f22-3cef16c4622b-kube-api-access-2mbcv\") pod \"ironic-operator-controller-manager-9fc8d5567-8xchz\" (UID: \"bddd647a-c213-41dd-9f22-3cef16c4622b\") " pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-8xchz" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.031597 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7w97\" (UniqueName: \"kubernetes.io/projected/75843062-7193-4953-add3-5859f3dce7de-kube-api-access-h7w97\") pod \"infra-operator-controller-manager-858cd69f49-zj5dm\" (UID: \"75843062-7193-4953-add3-5859f3dce7de\") " pod="openstack-operators/infra-operator-controller-manager-858cd69f49-zj5dm" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.031674 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h56jp\" (UniqueName: \"kubernetes.io/projected/40dffb60-1139-4864-b251-0aa8c145b66e-kube-api-access-h56jp\") pod \"glance-operator-controller-manager-67b5d44b7f-r4w8j\" (UID: \"40dffb60-1139-4864-b251-0aa8c145b66e\") " pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-r4w8j" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.031746 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vznss\" (UniqueName: \"kubernetes.io/projected/a04ca278-c2e3-4b48-85f8-16972204c367-kube-api-access-vznss\") pod \"designate-operator-controller-manager-7d74f4d695-m5dl2\" (UID: \"a04ca278-c2e3-4b48-85f8-16972204c367\") " pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-m5dl2" Sep 29 10:02:26 crc 
kubenswrapper[4891]: I0929 10:02:26.031933 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c48cb\" (UniqueName: \"kubernetes.io/projected/b1ce187f-22c9-47c7-9f8b-e8d4b6c2aa31-kube-api-access-c48cb\") pod \"heat-operator-controller-manager-8ff95898-x2qwd\" (UID: \"b1ce187f-22c9-47c7-9f8b-e8d4b6c2aa31\") " pod="openstack-operators/heat-operator-controller-manager-8ff95898-x2qwd" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.031987 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7wcl\" (UniqueName: \"kubernetes.io/projected/543e23f1-51b6-489d-91d8-b1550bb69680-kube-api-access-n7wcl\") pod \"horizon-operator-controller-manager-695847bc78-lwsw4\" (UID: \"543e23f1-51b6-489d-91d8-b1550bb69680\") " pod="openstack-operators/horizon-operator-controller-manager-695847bc78-lwsw4" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.032027 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75843062-7193-4953-add3-5859f3dce7de-cert\") pod \"infra-operator-controller-manager-858cd69f49-zj5dm\" (UID: \"75843062-7193-4953-add3-5859f3dce7de\") " pod="openstack-operators/infra-operator-controller-manager-858cd69f49-zj5dm" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.039365 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-w2lkh" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.054471 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-vfzjw" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.054560 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-xlm9q" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.058554 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-748c574d75-qmk9v" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.075045 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c48cb\" (UniqueName: \"kubernetes.io/projected/b1ce187f-22c9-47c7-9f8b-e8d4b6c2aa31-kube-api-access-c48cb\") pod \"heat-operator-controller-manager-8ff95898-x2qwd\" (UID: \"b1ce187f-22c9-47c7-9f8b-e8d4b6c2aa31\") " pod="openstack-operators/heat-operator-controller-manager-8ff95898-x2qwd" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.127283 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h56jp\" (UniqueName: \"kubernetes.io/projected/40dffb60-1139-4864-b251-0aa8c145b66e-kube-api-access-h56jp\") pod \"glance-operator-controller-manager-67b5d44b7f-r4w8j\" (UID: \"40dffb60-1139-4864-b251-0aa8c145b66e\") " pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-r4w8j" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.135110 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75843062-7193-4953-add3-5859f3dce7de-cert\") pod \"infra-operator-controller-manager-858cd69f49-zj5dm\" (UID: \"75843062-7193-4953-add3-5859f3dce7de\") " 
pod="openstack-operators/infra-operator-controller-manager-858cd69f49-zj5dm" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.139558 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mbcv\" (UniqueName: \"kubernetes.io/projected/bddd647a-c213-41dd-9f22-3cef16c4622b-kube-api-access-2mbcv\") pod \"ironic-operator-controller-manager-9fc8d5567-8xchz\" (UID: \"bddd647a-c213-41dd-9f22-3cef16c4622b\") " pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-8xchz" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.139661 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7w97\" (UniqueName: \"kubernetes.io/projected/75843062-7193-4953-add3-5859f3dce7de-kube-api-access-h7w97\") pod \"infra-operator-controller-manager-858cd69f49-zj5dm\" (UID: \"75843062-7193-4953-add3-5859f3dce7de\") " pod="openstack-operators/infra-operator-controller-manager-858cd69f49-zj5dm" Sep 29 10:02:26 crc kubenswrapper[4891]: E0929 10:02:26.140461 4891 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Sep 29 10:02:26 crc kubenswrapper[4891]: E0929 10:02:26.140518 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75843062-7193-4953-add3-5859f3dce7de-cert podName:75843062-7193-4953-add3-5859f3dce7de nodeName:}" failed. No retries permitted until 2025-09-29 10:02:26.640495426 +0000 UTC m=+876.845663737 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75843062-7193-4953-add3-5859f3dce7de-cert") pod "infra-operator-controller-manager-858cd69f49-zj5dm" (UID: "75843062-7193-4953-add3-5859f3dce7de") : secret "infra-operator-webhook-server-cert" not found Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.142588 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-8ff95898-x2qwd" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.190184 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54d766c9f9-f7t8j"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.198984 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-687b9cf756-gxs5h"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.200618 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-gxs5h" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.206599 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-4ddfw" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.237828 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-t4j4f"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.239183 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-t4j4f" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.241440 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7wcl\" (UniqueName: \"kubernetes.io/projected/543e23f1-51b6-489d-91d8-b1550bb69680-kube-api-access-n7wcl\") pod \"horizon-operator-controller-manager-695847bc78-lwsw4\" (UID: \"543e23f1-51b6-489d-91d8-b1550bb69680\") " pod="openstack-operators/horizon-operator-controller-manager-695847bc78-lwsw4" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.241988 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mbcv\" (UniqueName: \"kubernetes.io/projected/bddd647a-c213-41dd-9f22-3cef16c4622b-kube-api-access-2mbcv\") pod \"ironic-operator-controller-manager-9fc8d5567-8xchz\" (UID: \"bddd647a-c213-41dd-9f22-3cef16c4622b\") " pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-8xchz" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.243371 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjqmt\" (UniqueName: \"kubernetes.io/projected/6467aac8-0edf-44db-b402-518abc31f6a1-kube-api-access-zjqmt\") pod \"keystone-operator-controller-manager-7bf498966c-2nk4k\" (UID: \"6467aac8-0edf-44db-b402-518abc31f6a1\") " pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-2nk4k" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.243433 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbkmw\" (UniqueName: \"kubernetes.io/projected/9f51bd90-5b61-4cec-875e-d515cc501a22-kube-api-access-bbkmw\") pod \"manila-operator-controller-manager-56cf9c6b99-6jllh\" (UID: \"9f51bd90-5b61-4cec-875e-d515cc501a22\") " pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-6jllh" Sep 29 10:02:26 crc 
kubenswrapper[4891]: I0929 10:02:26.243470 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9gcl\" (UniqueName: \"kubernetes.io/projected/177d1c2e-3396-4516-aed4-31227f05abff-kube-api-access-s9gcl\") pod \"neutron-operator-controller-manager-54d766c9f9-f7t8j\" (UID: \"177d1c2e-3396-4516-aed4-31227f05abff\") " pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-f7t8j" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.243694 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lsvl4"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.244047 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lsvl4" podUID="9eb7270e-4994-471c-bb92-356bf4d3708b" containerName="registry-server" containerID="cri-o://bb9891fe5e0fd29f211332d0a2e217e070d642a049098675881c5b598a769ff0" gracePeriod=2 Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.244454 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vznss\" (UniqueName: \"kubernetes.io/projected/a04ca278-c2e3-4b48-85f8-16972204c367-kube-api-access-vznss\") pod \"designate-operator-controller-manager-7d74f4d695-m5dl2\" (UID: \"a04ca278-c2e3-4b48-85f8-16972204c367\") " pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-m5dl2" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.248667 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-xgddk" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.250685 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-687b9cf756-gxs5h"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.258469 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-h7w97\" (UniqueName: \"kubernetes.io/projected/75843062-7193-4953-add3-5859f3dce7de-kube-api-access-h7w97\") pod \"infra-operator-controller-manager-858cd69f49-zj5dm\" (UID: \"75843062-7193-4953-add3-5859f3dce7de\") " pod="openstack-operators/infra-operator-controller-manager-858cd69f49-zj5dm" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.272707 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-8xchz" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.278889 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-hk7zb"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.280696 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-hk7zb" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.284587 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-gskx2" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.291393 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-t4j4f"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.293843 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-hk7zb"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.338653 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-f5bh2"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.340693 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-f5bh2" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.345967 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pfdz\" (UniqueName: \"kubernetes.io/projected/348984e7-163d-4396-84f5-319eb4fc79fb-kube-api-access-8pfdz\") pod \"octavia-operator-controller-manager-76fcc6dc7c-t4j4f\" (UID: \"348984e7-163d-4396-84f5-319eb4fc79fb\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-t4j4f" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.346032 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjqmt\" (UniqueName: \"kubernetes.io/projected/6467aac8-0edf-44db-b402-518abc31f6a1-kube-api-access-zjqmt\") pod \"keystone-operator-controller-manager-7bf498966c-2nk4k\" (UID: \"6467aac8-0edf-44db-b402-518abc31f6a1\") " pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-2nk4k" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.346054 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbkmw\" (UniqueName: \"kubernetes.io/projected/9f51bd90-5b61-4cec-875e-d515cc501a22-kube-api-access-bbkmw\") pod \"manila-operator-controller-manager-56cf9c6b99-6jllh\" (UID: \"9f51bd90-5b61-4cec-875e-d515cc501a22\") " pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-6jllh" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.346078 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9gcl\" (UniqueName: \"kubernetes.io/projected/177d1c2e-3396-4516-aed4-31227f05abff-kube-api-access-s9gcl\") pod \"neutron-operator-controller-manager-54d766c9f9-f7t8j\" (UID: \"177d1c2e-3396-4516-aed4-31227f05abff\") " pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-f7t8j" Sep 29 10:02:26 crc 
kubenswrapper[4891]: I0929 10:02:26.346120 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8kfq\" (UniqueName: \"kubernetes.io/projected/67ca192a-9f26-47d4-b299-35b0522e9e53-kube-api-access-b8kfq\") pod \"mariadb-operator-controller-manager-687b9cf756-gxs5h\" (UID: \"67ca192a-9f26-47d4-b299-35b0522e9e53\") " pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-gxs5h" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.347210 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-kk6kk" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.348739 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.351056 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5f95c46c78-d45q4"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.352667 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-d45q4" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.373808 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-pcwns" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.375928 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-f5bh2"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.381397 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbkmw\" (UniqueName: \"kubernetes.io/projected/9f51bd90-5b61-4cec-875e-d515cc501a22-kube-api-access-bbkmw\") pod \"manila-operator-controller-manager-56cf9c6b99-6jllh\" (UID: \"9f51bd90-5b61-4cec-875e-d515cc501a22\") " pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-6jllh" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.381866 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjqmt\" (UniqueName: \"kubernetes.io/projected/6467aac8-0edf-44db-b402-518abc31f6a1-kube-api-access-zjqmt\") pod \"keystone-operator-controller-manager-7bf498966c-2nk4k\" (UID: \"6467aac8-0edf-44db-b402-518abc31f6a1\") " pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-2nk4k" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.382554 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-m5dl2" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.389106 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5f95c46c78-d45q4"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.397140 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9gcl\" (UniqueName: \"kubernetes.io/projected/177d1c2e-3396-4516-aed4-31227f05abff-kube-api-access-s9gcl\") pod \"neutron-operator-controller-manager-54d766c9f9-f7t8j\" (UID: \"177d1c2e-3396-4516-aed4-31227f05abff\") " pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-f7t8j" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.410856 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-r4w8j" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.447136 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpl5l\" (UniqueName: \"kubernetes.io/projected/293261f0-9425-4e31-a66d-d8ad8a913228-kube-api-access-bpl5l\") pod \"ovn-operator-controller-manager-5f95c46c78-d45q4\" (UID: \"293261f0-9425-4e31-a66d-d8ad8a913228\") " pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-d45q4" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.447230 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8kfq\" (UniqueName: \"kubernetes.io/projected/67ca192a-9f26-47d4-b299-35b0522e9e53-kube-api-access-b8kfq\") pod \"mariadb-operator-controller-manager-687b9cf756-gxs5h\" (UID: \"67ca192a-9f26-47d4-b299-35b0522e9e53\") " pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-gxs5h" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.447260 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcp6j\" (UniqueName: \"kubernetes.io/projected/910f1b22-b26a-4e74-b716-89b912927374-kube-api-access-hcp6j\") pod \"nova-operator-controller-manager-c7c776c96-hk7zb\" (UID: \"910f1b22-b26a-4e74-b716-89b912927374\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-hk7zb" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.447295 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px5pg\" (UniqueName: \"kubernetes.io/projected/51a34f9a-d71a-45d0-9a76-01d629fc7d79-kube-api-access-px5pg\") pod \"openstack-baremetal-operator-controller-manager-6d776955-f5bh2\" (UID: \"51a34f9a-d71a-45d0-9a76-01d629fc7d79\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-f5bh2" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.447363 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51a34f9a-d71a-45d0-9a76-01d629fc7d79-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-f5bh2\" (UID: \"51a34f9a-d71a-45d0-9a76-01d629fc7d79\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-f5bh2" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.447392 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pfdz\" (UniqueName: \"kubernetes.io/projected/348984e7-163d-4396-84f5-319eb4fc79fb-kube-api-access-8pfdz\") pod \"octavia-operator-controller-manager-76fcc6dc7c-t4j4f\" (UID: \"348984e7-163d-4396-84f5-319eb4fc79fb\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-t4j4f" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.453771 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-774b97b48-xlml4"] 
Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.455002 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-9xmvx"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.462912 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-774b97b48-xlml4" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.467381 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-lwsw4" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.479446 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-774b97b48-xlml4"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.479582 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-9xmvx" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.506198 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-9mp6l" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.510663 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-9xmvx"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.516285 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-bgnv7" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.549425 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pfdz\" (UniqueName: \"kubernetes.io/projected/348984e7-163d-4396-84f5-319eb4fc79fb-kube-api-access-8pfdz\") pod \"octavia-operator-controller-manager-76fcc6dc7c-t4j4f\" (UID: 
\"348984e7-163d-4396-84f5-319eb4fc79fb\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-t4j4f" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.553694 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcp6j\" (UniqueName: \"kubernetes.io/projected/910f1b22-b26a-4e74-b716-89b912927374-kube-api-access-hcp6j\") pod \"nova-operator-controller-manager-c7c776c96-hk7zb\" (UID: \"910f1b22-b26a-4e74-b716-89b912927374\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-hk7zb" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.553843 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px5pg\" (UniqueName: \"kubernetes.io/projected/51a34f9a-d71a-45d0-9a76-01d629fc7d79-kube-api-access-px5pg\") pod \"openstack-baremetal-operator-controller-manager-6d776955-f5bh2\" (UID: \"51a34f9a-d71a-45d0-9a76-01d629fc7d79\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-f5bh2" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.553899 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbpb4\" (UniqueName: \"kubernetes.io/projected/48c30870-804a-4f13-95f4-ec4a5a02b536-kube-api-access-qbpb4\") pod \"swift-operator-controller-manager-bc7dc7bd9-9xmvx\" (UID: \"48c30870-804a-4f13-95f4-ec4a5a02b536\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-9xmvx" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.553966 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8kfq\" (UniqueName: \"kubernetes.io/projected/67ca192a-9f26-47d4-b299-35b0522e9e53-kube-api-access-b8kfq\") pod \"mariadb-operator-controller-manager-687b9cf756-gxs5h\" (UID: \"67ca192a-9f26-47d4-b299-35b0522e9e53\") " pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-gxs5h" Sep 29 10:02:26 
crc kubenswrapper[4891]: I0929 10:02:26.554112 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51a34f9a-d71a-45d0-9a76-01d629fc7d79-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-f5bh2\" (UID: \"51a34f9a-d71a-45d0-9a76-01d629fc7d79\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-f5bh2" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.554255 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpl5l\" (UniqueName: \"kubernetes.io/projected/293261f0-9425-4e31-a66d-d8ad8a913228-kube-api-access-bpl5l\") pod \"ovn-operator-controller-manager-5f95c46c78-d45q4\" (UID: \"293261f0-9425-4e31-a66d-d8ad8a913228\") " pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-d45q4" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.554401 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7wvc\" (UniqueName: \"kubernetes.io/projected/950148f3-aa8c-45bd-9922-6c4e2683d004-kube-api-access-p7wvc\") pod \"placement-operator-controller-manager-774b97b48-xlml4\" (UID: \"950148f3-aa8c-45bd-9922-6c4e2683d004\") " pod="openstack-operators/placement-operator-controller-manager-774b97b48-xlml4" Sep 29 10:02:26 crc kubenswrapper[4891]: E0929 10:02:26.554645 4891 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 29 10:02:26 crc kubenswrapper[4891]: E0929 10:02:26.554717 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51a34f9a-d71a-45d0-9a76-01d629fc7d79-cert podName:51a34f9a-d71a-45d0-9a76-01d629fc7d79 nodeName:}" failed. No retries permitted until 2025-09-29 10:02:27.054689512 +0000 UTC m=+877.259857833 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/51a34f9a-d71a-45d0-9a76-01d629fc7d79-cert") pod "openstack-baremetal-operator-controller-manager-6d776955-f5bh2" (UID: "51a34f9a-d71a-45d0-9a76-01d629fc7d79") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.603363 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-2nk4k" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.603680 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpl5l\" (UniqueName: \"kubernetes.io/projected/293261f0-9425-4e31-a66d-d8ad8a913228-kube-api-access-bpl5l\") pod \"ovn-operator-controller-manager-5f95c46c78-d45q4\" (UID: \"293261f0-9425-4e31-a66d-d8ad8a913228\") " pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-d45q4" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.605550 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-6jllh" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.606931 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcp6j\" (UniqueName: \"kubernetes.io/projected/910f1b22-b26a-4e74-b716-89b912927374-kube-api-access-hcp6j\") pod \"nova-operator-controller-manager-c7c776c96-hk7zb\" (UID: \"910f1b22-b26a-4e74-b716-89b912927374\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-hk7zb" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.620416 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-8kqf5"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.622188 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-8kqf5" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.622463 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-d45q4" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.643019 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-bfc5q" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.643029 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px5pg\" (UniqueName: \"kubernetes.io/projected/51a34f9a-d71a-45d0-9a76-01d629fc7d79-kube-api-access-px5pg\") pod \"openstack-baremetal-operator-controller-manager-6d776955-f5bh2\" (UID: \"51a34f9a-d71a-45d0-9a76-01d629fc7d79\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-f5bh2" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.656087 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7wvc\" (UniqueName: \"kubernetes.io/projected/950148f3-aa8c-45bd-9922-6c4e2683d004-kube-api-access-p7wvc\") pod \"placement-operator-controller-manager-774b97b48-xlml4\" (UID: \"950148f3-aa8c-45bd-9922-6c4e2683d004\") " pod="openstack-operators/placement-operator-controller-manager-774b97b48-xlml4" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.656155 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbpb4\" (UniqueName: \"kubernetes.io/projected/48c30870-804a-4f13-95f4-ec4a5a02b536-kube-api-access-qbpb4\") pod \"swift-operator-controller-manager-bc7dc7bd9-9xmvx\" (UID: \"48c30870-804a-4f13-95f4-ec4a5a02b536\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-9xmvx" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.656193 4891 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75843062-7193-4953-add3-5859f3dce7de-cert\") pod \"infra-operator-controller-manager-858cd69f49-zj5dm\" (UID: \"75843062-7193-4953-add3-5859f3dce7de\") " pod="openstack-operators/infra-operator-controller-manager-858cd69f49-zj5dm" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.693884 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-r2lgd"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.695413 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-r2lgd" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.701589 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75843062-7193-4953-add3-5859f3dce7de-cert\") pod \"infra-operator-controller-manager-858cd69f49-zj5dm\" (UID: \"75843062-7193-4953-add3-5859f3dce7de\") " pod="openstack-operators/infra-operator-controller-manager-858cd69f49-zj5dm" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.702198 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-f7t8j" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.713330 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-8kqf5"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.729545 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-gxs5h" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.730548 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbpb4\" (UniqueName: \"kubernetes.io/projected/48c30870-804a-4f13-95f4-ec4a5a02b536-kube-api-access-qbpb4\") pod \"swift-operator-controller-manager-bc7dc7bd9-9xmvx\" (UID: \"48c30870-804a-4f13-95f4-ec4a5a02b536\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-9xmvx" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.733266 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-d7cb2" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.747867 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-r2lgd"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.759157 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f7z9\" (UniqueName: \"kubernetes.io/projected/539a685d-4cdf-4344-a7a3-448ec5e9ba6e-kube-api-access-9f7z9\") pod \"telemetry-operator-controller-manager-5bf96cfbc4-8kqf5\" (UID: \"539a685d-4cdf-4344-a7a3-448ec5e9ba6e\") " pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-8kqf5" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.759206 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw5mq\" (UniqueName: \"kubernetes.io/projected/59104851-7ccd-446a-9441-ef993caefd10-kube-api-access-dw5mq\") pod \"test-operator-controller-manager-f66b554c6-r2lgd\" (UID: \"59104851-7ccd-446a-9441-ef993caefd10\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-r2lgd" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.767272 4891 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-t4j4f" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.787405 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76669f99c-q67kx"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.789057 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-q67kx" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.800152 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7wvc\" (UniqueName: \"kubernetes.io/projected/950148f3-aa8c-45bd-9922-6c4e2683d004-kube-api-access-p7wvc\") pod \"placement-operator-controller-manager-774b97b48-xlml4\" (UID: \"950148f3-aa8c-45bd-9922-6c4e2683d004\") " pod="openstack-operators/placement-operator-controller-manager-774b97b48-xlml4" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.802180 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-hk7zb" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.804190 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-z8cw8" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.828252 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76669f99c-q67kx"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.831394 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-zj5dm" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.861460 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgq2l\" (UniqueName: \"kubernetes.io/projected/1ab4abbc-82b1-4624-856b-cbd9062184c0-kube-api-access-rgq2l\") pod \"watcher-operator-controller-manager-76669f99c-q67kx\" (UID: \"1ab4abbc-82b1-4624-856b-cbd9062184c0\") " pod="openstack-operators/watcher-operator-controller-manager-76669f99c-q67kx" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.861558 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f7z9\" (UniqueName: \"kubernetes.io/projected/539a685d-4cdf-4344-a7a3-448ec5e9ba6e-kube-api-access-9f7z9\") pod \"telemetry-operator-controller-manager-5bf96cfbc4-8kqf5\" (UID: \"539a685d-4cdf-4344-a7a3-448ec5e9ba6e\") " pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-8kqf5" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.861598 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw5mq\" (UniqueName: \"kubernetes.io/projected/59104851-7ccd-446a-9441-ef993caefd10-kube-api-access-dw5mq\") pod \"test-operator-controller-manager-f66b554c6-r2lgd\" (UID: \"59104851-7ccd-446a-9441-ef993caefd10\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-r2lgd" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.863705 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-774b97b48-xlml4" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.887947 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw5mq\" (UniqueName: \"kubernetes.io/projected/59104851-7ccd-446a-9441-ef993caefd10-kube-api-access-dw5mq\") pod \"test-operator-controller-manager-f66b554c6-r2lgd\" (UID: \"59104851-7ccd-446a-9441-ef993caefd10\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-r2lgd" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.892602 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f7z9\" (UniqueName: \"kubernetes.io/projected/539a685d-4cdf-4344-a7a3-448ec5e9ba6e-kube-api-access-9f7z9\") pod \"telemetry-operator-controller-manager-5bf96cfbc4-8kqf5\" (UID: \"539a685d-4cdf-4344-a7a3-448ec5e9ba6e\") " pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-8kqf5" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.901494 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6475d4f6d5-ckgrz"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.902987 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6475d4f6d5-ckgrz" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.904452 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-9xmvx" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.906366 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.906614 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-rdhk7" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.913946 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6475d4f6d5-ckgrz"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.937672 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-8b4x9"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.955895 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-8b4x9"] Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.956047 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-8b4x9" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.961900 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-gxnpv" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.964172 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgq2l\" (UniqueName: \"kubernetes.io/projected/1ab4abbc-82b1-4624-856b-cbd9062184c0-kube-api-access-rgq2l\") pod \"watcher-operator-controller-manager-76669f99c-q67kx\" (UID: \"1ab4abbc-82b1-4624-856b-cbd9062184c0\") " pod="openstack-operators/watcher-operator-controller-manager-76669f99c-q67kx" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.964238 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48j6z\" (UniqueName: \"kubernetes.io/projected/26216b37-e307-4ecb-ade6-2402d26f32d9-kube-api-access-48j6z\") pod \"openstack-operator-controller-manager-6475d4f6d5-ckgrz\" (UID: \"26216b37-e307-4ecb-ade6-2402d26f32d9\") " pod="openstack-operators/openstack-operator-controller-manager-6475d4f6d5-ckgrz" Sep 29 10:02:26 crc kubenswrapper[4891]: I0929 10:02:26.964329 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26216b37-e307-4ecb-ade6-2402d26f32d9-cert\") pod \"openstack-operator-controller-manager-6475d4f6d5-ckgrz\" (UID: \"26216b37-e307-4ecb-ade6-2402d26f32d9\") " pod="openstack-operators/openstack-operator-controller-manager-6475d4f6d5-ckgrz" Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.033711 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgq2l\" (UniqueName: \"kubernetes.io/projected/1ab4abbc-82b1-4624-856b-cbd9062184c0-kube-api-access-rgq2l\") pod 
\"watcher-operator-controller-manager-76669f99c-q67kx\" (UID: \"1ab4abbc-82b1-4624-856b-cbd9062184c0\") " pod="openstack-operators/watcher-operator-controller-manager-76669f99c-q67kx" Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.065799 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26216b37-e307-4ecb-ade6-2402d26f32d9-cert\") pod \"openstack-operator-controller-manager-6475d4f6d5-ckgrz\" (UID: \"26216b37-e307-4ecb-ade6-2402d26f32d9\") " pod="openstack-operators/openstack-operator-controller-manager-6475d4f6d5-ckgrz" Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.065920 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51a34f9a-d71a-45d0-9a76-01d629fc7d79-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-f5bh2\" (UID: \"51a34f9a-d71a-45d0-9a76-01d629fc7d79\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-f5bh2" Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.065998 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48j6z\" (UniqueName: \"kubernetes.io/projected/26216b37-e307-4ecb-ade6-2402d26f32d9-kube-api-access-48j6z\") pod \"openstack-operator-controller-manager-6475d4f6d5-ckgrz\" (UID: \"26216b37-e307-4ecb-ade6-2402d26f32d9\") " pod="openstack-operators/openstack-operator-controller-manager-6475d4f6d5-ckgrz" Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.066078 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8vqr\" (UniqueName: \"kubernetes.io/projected/31d92d3d-3a46-416c-b5f0-6fb12bb5bead-kube-api-access-m8vqr\") pod \"rabbitmq-cluster-operator-manager-79d8469568-8b4x9\" (UID: \"31d92d3d-3a46-416c-b5f0-6fb12bb5bead\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-8b4x9" 
Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.067143 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-8kqf5" Sep 29 10:02:27 crc kubenswrapper[4891]: E0929 10:02:27.067706 4891 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 29 10:02:27 crc kubenswrapper[4891]: E0929 10:02:27.067768 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51a34f9a-d71a-45d0-9a76-01d629fc7d79-cert podName:51a34f9a-d71a-45d0-9a76-01d629fc7d79 nodeName:}" failed. No retries permitted until 2025-09-29 10:02:28.06775067 +0000 UTC m=+878.272918991 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/51a34f9a-d71a-45d0-9a76-01d629fc7d79-cert") pod "openstack-baremetal-operator-controller-manager-6d776955-f5bh2" (UID: "51a34f9a-d71a-45d0-9a76-01d629fc7d79") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.071014 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26216b37-e307-4ecb-ade6-2402d26f32d9-cert\") pod \"openstack-operator-controller-manager-6475d4f6d5-ckgrz\" (UID: \"26216b37-e307-4ecb-ade6-2402d26f32d9\") " pod="openstack-operators/openstack-operator-controller-manager-6475d4f6d5-ckgrz" Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.088237 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48j6z\" (UniqueName: \"kubernetes.io/projected/26216b37-e307-4ecb-ade6-2402d26f32d9-kube-api-access-48j6z\") pod \"openstack-operator-controller-manager-6475d4f6d5-ckgrz\" (UID: \"26216b37-e307-4ecb-ade6-2402d26f32d9\") " 
pod="openstack-operators/openstack-operator-controller-manager-6475d4f6d5-ckgrz" Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.128368 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-r2lgd" Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.167361 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-q67kx" Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.168171 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8vqr\" (UniqueName: \"kubernetes.io/projected/31d92d3d-3a46-416c-b5f0-6fb12bb5bead-kube-api-access-m8vqr\") pod \"rabbitmq-cluster-operator-manager-79d8469568-8b4x9\" (UID: \"31d92d3d-3a46-416c-b5f0-6fb12bb5bead\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-8b4x9" Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.205755 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6475d4f6d5-ckgrz" Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.206943 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8vqr\" (UniqueName: \"kubernetes.io/projected/31d92d3d-3a46-416c-b5f0-6fb12bb5bead-kube-api-access-m8vqr\") pod \"rabbitmq-cluster-operator-manager-79d8469568-8b4x9\" (UID: \"31d92d3d-3a46-416c-b5f0-6fb12bb5bead\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-8b4x9" Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.219859 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-8b4x9" Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.360950 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-8ff95898-x2qwd"] Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.368666 4891 generic.go:334] "Generic (PLEG): container finished" podID="9eb7270e-4994-471c-bb92-356bf4d3708b" containerID="bb9891fe5e0fd29f211332d0a2e217e070d642a049098675881c5b598a769ff0" exitCode=0 Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.368708 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lsvl4" event={"ID":"9eb7270e-4994-471c-bb92-356bf4d3708b","Type":"ContainerDied","Data":"bb9891fe5e0fd29f211332d0a2e217e070d642a049098675881c5b598a769ff0"} Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.368978 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6495d75b5-w2lkh"] Sep 29 10:02:27 crc kubenswrapper[4891]: W0929 10:02:27.445452 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28d145a8_69b6_4cf0_be6b_8bfbd0d2df07.slice/crio-1328e8b74dc273df130bbcee101b4077623bd5410bbb8f0d2c8b347040ebd43c WatchSource:0}: Error finding container 1328e8b74dc273df130bbcee101b4077623bd5410bbb8f0d2c8b347040ebd43c: Status 404 returned error can't find the container with id 1328e8b74dc273df130bbcee101b4077623bd5410bbb8f0d2c8b347040ebd43c Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.457272 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lsvl4" Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.515612 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9fc8d5567-8xchz"] Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.587806 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9eb7270e-4994-471c-bb92-356bf4d3708b-catalog-content\") pod \"9eb7270e-4994-471c-bb92-356bf4d3708b\" (UID: \"9eb7270e-4994-471c-bb92-356bf4d3708b\") " Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.587852 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdtl4\" (UniqueName: \"kubernetes.io/projected/9eb7270e-4994-471c-bb92-356bf4d3708b-kube-api-access-xdtl4\") pod \"9eb7270e-4994-471c-bb92-356bf4d3708b\" (UID: \"9eb7270e-4994-471c-bb92-356bf4d3708b\") " Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.587905 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9eb7270e-4994-471c-bb92-356bf4d3708b-utilities\") pod \"9eb7270e-4994-471c-bb92-356bf4d3708b\" (UID: \"9eb7270e-4994-471c-bb92-356bf4d3708b\") " Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.590566 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9eb7270e-4994-471c-bb92-356bf4d3708b-utilities" (OuterVolumeSpecName: "utilities") pod "9eb7270e-4994-471c-bb92-356bf4d3708b" (UID: "9eb7270e-4994-471c-bb92-356bf4d3708b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.607854 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eb7270e-4994-471c-bb92-356bf4d3708b-kube-api-access-xdtl4" (OuterVolumeSpecName: "kube-api-access-xdtl4") pod "9eb7270e-4994-471c-bb92-356bf4d3708b" (UID: "9eb7270e-4994-471c-bb92-356bf4d3708b"). InnerVolumeSpecName "kube-api-access-xdtl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.612432 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-748c574d75-qmk9v"] Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.626463 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9eb7270e-4994-471c-bb92-356bf4d3708b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9eb7270e-4994-471c-bb92-356bf4d3708b" (UID: "9eb7270e-4994-471c-bb92-356bf4d3708b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.634653 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-7d74f4d695-m5dl2"] Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.692064 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9eb7270e-4994-471c-bb92-356bf4d3708b-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.692099 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9eb7270e-4994-471c-bb92-356bf4d3708b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.692111 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdtl4\" (UniqueName: \"kubernetes.io/projected/9eb7270e-4994-471c-bb92-356bf4d3708b-kube-api-access-xdtl4\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:27 crc kubenswrapper[4891]: I0929 10:02:27.909558 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-695847bc78-lwsw4"] Sep 29 10:02:27 crc kubenswrapper[4891]: W0929 10:02:27.915068 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod543e23f1_51b6_489d_91d8_b1550bb69680.slice/crio-79dcb70bb44ed2d1b3054de78483e91efaec79e8c0aae527b9c69f186be341f9 WatchSource:0}: Error finding container 79dcb70bb44ed2d1b3054de78483e91efaec79e8c0aae527b9c69f186be341f9: Status 404 returned error can't find the container with id 79dcb70bb44ed2d1b3054de78483e91efaec79e8c0aae527b9c69f186be341f9 Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.103611 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/51a34f9a-d71a-45d0-9a76-01d629fc7d79-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-f5bh2\" (UID: \"51a34f9a-d71a-45d0-9a76-01d629fc7d79\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-f5bh2" Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.128052 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51a34f9a-d71a-45d0-9a76-01d629fc7d79-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-f5bh2\" (UID: \"51a34f9a-d71a-45d0-9a76-01d629fc7d79\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-f5bh2" Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.275622 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-hk7zb"] Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.293843 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-t4j4f"] Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.307911 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-67b5d44b7f-r4w8j"] Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.333333 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-687b9cf756-gxs5h"] Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.342301 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7bf498966c-2nk4k"] Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.349544 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54d766c9f9-f7t8j"] Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.362943 4891 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-56cf9c6b99-6jllh"] Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.370589 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-774b97b48-xlml4"] Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.381939 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5f95c46c78-d45q4"] Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.383631 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-858cd69f49-zj5dm"] Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.387493 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-9xmvx"] Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.393259 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-8b4x9"] Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.426727 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-f5bh2" Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.474413 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-8kqf5"] Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.474471 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-r2lgd"] Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.474491 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6475d4f6d5-ckgrz"] Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.480182 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76669f99c-q67kx"] Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.481219 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-w2lkh" event={"ID":"28d145a8-69b6-4cf0-be6b-8bfbd0d2df07","Type":"ContainerStarted","Data":"1328e8b74dc273df130bbcee101b4077623bd5410bbb8f0d2c8b347040ebd43c"} Sep 29 10:02:28 crc kubenswrapper[4891]: E0929 10:02:28.482896 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3c6f7d737e0196ec302f44354228d783ad3b210a75703dda3b39c15c01a67e8c,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: 
{{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qbpb4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-bc7dc7bd9-9xmvx_openstack-operators(48c30870-804a-4f13-95f4-ec4a5a02b536): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.489977 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lsvl4" 
event={"ID":"9eb7270e-4994-471c-bb92-356bf4d3708b","Type":"ContainerDied","Data":"94568ef2c293855d12bd84f98673c4a58f6262511426b2ac2cd0015dbc599c51"} Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.490051 4891 scope.go:117] "RemoveContainer" containerID="bb9891fe5e0fd29f211332d0a2e217e070d642a049098675881c5b598a769ff0" Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.490274 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lsvl4" Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.496156 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-m5dl2" event={"ID":"a04ca278-c2e3-4b48-85f8-16972204c367","Type":"ContainerStarted","Data":"1320bb5a27538c3da1083c2ce7c8aefb3cc755eeed7f433144823ab14eb1a3b1"} Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.501841 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-hk7zb" event={"ID":"910f1b22-b26a-4e74-b716-89b912927374","Type":"ContainerStarted","Data":"4d8045c5e933ab81132564f3812cc003ff3c6339cee79c4a2a942b69fc8ed391"} Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.504718 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8ff95898-x2qwd" event={"ID":"b1ce187f-22c9-47c7-9f8b-e8d4b6c2aa31","Type":"ContainerStarted","Data":"8808a08b90d7be3f6507ec5f9b321c4b6e165c6f044c7f775be1d7a8a472d9f9"} Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.506203 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-748c574d75-qmk9v" event={"ID":"a7ad802e-1b9c-4ab0-a7eb-82932b6f5090","Type":"ContainerStarted","Data":"c70b44a7b5bf49c97091074f84f8868494f9dcd024063327985127d43bbc7e62"} Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.507430 4891 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-8xchz" event={"ID":"bddd647a-c213-41dd-9f22-3cef16c4622b","Type":"ContainerStarted","Data":"3a9ad01be85974c68902b4897d46e50465d721c08900d63a9c49b25c8e5c4bac"} Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.510231 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-lwsw4" event={"ID":"543e23f1-51b6-489d-91d8-b1550bb69680","Type":"ContainerStarted","Data":"79dcb70bb44ed2d1b3054de78483e91efaec79e8c0aae527b9c69f186be341f9"} Sep 29 10:02:28 crc kubenswrapper[4891]: E0929 10:02:28.531469 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:bdf49c202aba5000737445bc4aeee6c5cdc6dd29c3dcd1394df9f8695830f9c6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s9gcl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-54d766c9f9-f7t8j_openstack-operators(177d1c2e-3396-4516-aed4-31227f05abff): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 29 10:02:28 crc kubenswrapper[4891]: W0929 10:02:28.549850 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ab4abbc_82b1_4624_856b_cbd9062184c0.slice/crio-9f1a99cc925eb3655c616015bfee4fc0b11a9233e335d51ca4606b643dd694a2 WatchSource:0}: Error finding container 9f1a99cc925eb3655c616015bfee4fc0b11a9233e335d51ca4606b643dd694a2: Status 404 returned error can't find the container with id 9f1a99cc925eb3655c616015bfee4fc0b11a9233e335d51ca4606b643dd694a2 Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.557445 4891 scope.go:117] "RemoveContainer" 
containerID="08da01f35f49dd808ad69eb4b54ba2fb849128239d3e53dbff036c1374191876" Sep 29 10:02:28 crc kubenswrapper[4891]: E0929 10:02:28.557621 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rgq2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-76669f99c-q67kx_openstack-operators(1ab4abbc-82b1-4624-856b-cbd9062184c0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 29 10:02:28 crc kubenswrapper[4891]: E0929 10:02:28.557899 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:a303e460aec09217f90043b8ff19c01061af003b614833b33a593df9c00ddf80,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dw5mq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-f66b554c6-r2lgd_openstack-operators(59104851-7ccd-446a-9441-ef993caefd10): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 29 10:02:28 crc kubenswrapper[4891]: E0929 10:02:28.558015 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m8vqr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-79d8469568-8b4x9_openstack-operators(31d92d3d-3a46-416c-b5f0-6fb12bb5bead): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 29 
10:02:28 crc kubenswrapper[4891]: E0929 10:02:28.559104 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-8b4x9" podUID="31d92d3d-3a46-416c-b5f0-6fb12bb5bead" Sep 29 10:02:28 crc kubenswrapper[4891]: E0929 10:02:28.572898 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:8961fc302c92bf476c1d00be0c02e964c449032f8d17672389cff40c71eeb1d3,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h56jp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-67b5d44b7f-r4w8j_openstack-operators(40dffb60-1139-4864-b251-0aa8c145b66e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.620448 4891 scope.go:117] "RemoveContainer" containerID="ed448bc9eca3fa3d4309f4f664c1761e3f24b99958e3f40150e4f202aa6f38f8" Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.625226 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lsvl4"] Sep 29 10:02:28 crc kubenswrapper[4891]: I0929 10:02:28.630105 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lsvl4"] Sep 29 10:02:28 crc kubenswrapper[4891]: E0929 10:02:28.879884 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-9xmvx" podUID="48c30870-804a-4f13-95f4-ec4a5a02b536" Sep 29 10:02:28 crc kubenswrapper[4891]: E0929 10:02:28.917415 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" 
with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-r4w8j" podUID="40dffb60-1139-4864-b251-0aa8c145b66e" Sep 29 10:02:29 crc kubenswrapper[4891]: E0929 10:02:29.022386 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-f66b554c6-r2lgd" podUID="59104851-7ccd-446a-9441-ef993caefd10" Sep 29 10:02:29 crc kubenswrapper[4891]: E0929 10:02:29.060137 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-f7t8j" podUID="177d1c2e-3396-4516-aed4-31227f05abff" Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.177487 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-f5bh2"] Sep 29 10:02:29 crc kubenswrapper[4891]: E0929 10:02:29.222858 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-q67kx" podUID="1ab4abbc-82b1-4624-856b-cbd9062184c0" Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.572696 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-6jllh" event={"ID":"9f51bd90-5b61-4cec-875e-d515cc501a22","Type":"ContainerStarted","Data":"42b90a99017e9f82f266f22f90b5f751465a5fe85a92663543e7e07cdd844619"} Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.575587 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-8kqf5" 
event={"ID":"539a685d-4cdf-4344-a7a3-448ec5e9ba6e","Type":"ContainerStarted","Data":"e65a2be7a74118ad846df71eaebc163fdb66278e01d0c9a073492b10e480437a"} Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.582280 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-2nk4k" event={"ID":"6467aac8-0edf-44db-b402-518abc31f6a1","Type":"ContainerStarted","Data":"4af68ac4fb3b5e03a2a8760747640bc3de16a53d7a383afc2820d58997242bad"} Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.614721 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6475d4f6d5-ckgrz" event={"ID":"26216b37-e307-4ecb-ade6-2402d26f32d9","Type":"ContainerStarted","Data":"9d1970d77cd186fdde6df779a524c04b53462d90fc839cd42297f6d236236dae"} Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.614897 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6475d4f6d5-ckgrz" event={"ID":"26216b37-e307-4ecb-ade6-2402d26f32d9","Type":"ContainerStarted","Data":"b27cb5301b5c35bb3d06ce3c0ecd52dbe53c58af9a7cb4de6a73dcfb65ab5b6a"} Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.614944 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6475d4f6d5-ckgrz" event={"ID":"26216b37-e307-4ecb-ade6-2402d26f32d9","Type":"ContainerStarted","Data":"e4998dd105cb68cc157815d825d702e693172dd3c4bc1dc3be02d4ab28e8a9ed"} Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.617149 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6475d4f6d5-ckgrz" Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.646937 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-8b4x9" 
event={"ID":"31d92d3d-3a46-416c-b5f0-6fb12bb5bead","Type":"ContainerStarted","Data":"31408becc6af5afc636b82a2a2428bc7d9281c4d8d28b03a1dbc678a193ba5c6"} Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.655016 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-774b97b48-xlml4" event={"ID":"950148f3-aa8c-45bd-9922-6c4e2683d004","Type":"ContainerStarted","Data":"3bea0439c3126d1e90973fe4647623cc9d83ecf083997a56e42fc18d9d7121b5"} Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.662947 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6475d4f6d5-ckgrz" podStartSLOduration=3.66292755 podStartE2EDuration="3.66292755s" podCreationTimestamp="2025-09-29 10:02:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:02:29.658150247 +0000 UTC m=+879.863318568" watchObservedRunningTime="2025-09-29 10:02:29.66292755 +0000 UTC m=+879.868095861" Sep 29 10:02:29 crc kubenswrapper[4891]: E0929 10:02:29.673782 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-8b4x9" podUID="31d92d3d-3a46-416c-b5f0-6fb12bb5bead" Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.675905 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-t4j4f" event={"ID":"348984e7-163d-4396-84f5-319eb4fc79fb","Type":"ContainerStarted","Data":"d9a84ad2143cac544b73ea915b3c4157ed6186e102da3bd453389c9166189301"} Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.683424 
4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-f7t8j" event={"ID":"177d1c2e-3396-4516-aed4-31227f05abff","Type":"ContainerStarted","Data":"d7f0afa0406df03c67fcc538ea944c9f1def619e478194c181186b074afe02a9"} Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.683507 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-f7t8j" event={"ID":"177d1c2e-3396-4516-aed4-31227f05abff","Type":"ContainerStarted","Data":"d85b95deec430c79fabf77aef8a2fdb95f98bd153925c487e36a154026e0a27e"} Sep 29 10:02:29 crc kubenswrapper[4891]: E0929 10:02:29.697770 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:bdf49c202aba5000737445bc4aeee6c5cdc6dd29c3dcd1394df9f8695830f9c6\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-f7t8j" podUID="177d1c2e-3396-4516-aed4-31227f05abff" Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.699949 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-zj5dm" event={"ID":"75843062-7193-4953-add3-5859f3dce7de","Type":"ContainerStarted","Data":"4ac0cdbeb5b9278bc32a469fd83ed37bacf73af4fa93c801dcac684f1e5ce943"} Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.707303 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-r2lgd" event={"ID":"59104851-7ccd-446a-9441-ef993caefd10","Type":"ContainerStarted","Data":"5e6b78c3e9b4879cd019f436193767143e767b97dc0104a9da1a0cd1fdb94966"} Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.707356 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-r2lgd" 
event={"ID":"59104851-7ccd-446a-9441-ef993caefd10","Type":"ContainerStarted","Data":"62b1495566759322e2bf6023270bcc482627291614c7dfd2644ba5201de0467a"} Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.717441 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-d45q4" event={"ID":"293261f0-9425-4e31-a66d-d8ad8a913228","Type":"ContainerStarted","Data":"36a32b4fb5770e5cb376d84be063b0cf0aedcee4709e493b6392aa8fa5fd3283"} Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.721880 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-r4w8j" event={"ID":"40dffb60-1139-4864-b251-0aa8c145b66e","Type":"ContainerStarted","Data":"d3b85695f0253128e565175f43c145aa37a87550aead0dc6823a482e75fe09b4"} Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.721919 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-r4w8j" event={"ID":"40dffb60-1139-4864-b251-0aa8c145b66e","Type":"ContainerStarted","Data":"de103972603f6517afd3b1a858dbed0bd943cec76d31c4572f0f1d8c5f79c74f"} Sep 29 10:02:29 crc kubenswrapper[4891]: E0929 10:02:29.725252 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:8961fc302c92bf476c1d00be0c02e964c449032f8d17672389cff40c71eeb1d3\\\"\"" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-r4w8j" podUID="40dffb60-1139-4864-b251-0aa8c145b66e" Sep 29 10:02:29 crc kubenswrapper[4891]: E0929 10:02:29.725703 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:a303e460aec09217f90043b8ff19c01061af003b614833b33a593df9c00ddf80\\\"\"" pod="openstack-operators/test-operator-controller-manager-f66b554c6-r2lgd" podUID="59104851-7ccd-446a-9441-ef993caefd10" Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.745922 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-f5bh2" event={"ID":"51a34f9a-d71a-45d0-9a76-01d629fc7d79","Type":"ContainerStarted","Data":"89b0f5edfc2f9780bdb9689786d81919f775057df29eb8aca170889a3563f5f7"} Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.763679 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-gxs5h" event={"ID":"67ca192a-9f26-47d4-b299-35b0522e9e53","Type":"ContainerStarted","Data":"13bc03ff5b2325fc771693a0de8e36bcb75d0a18cbfd8a459d31139b59937ac9"} Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.767369 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-9xmvx" event={"ID":"48c30870-804a-4f13-95f4-ec4a5a02b536","Type":"ContainerStarted","Data":"5e190d4a5428d988f2dd380d40491458c7da4b69b38a7ebc2d294aa47ac5b0eb"} Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.767433 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-9xmvx" event={"ID":"48c30870-804a-4f13-95f4-ec4a5a02b536","Type":"ContainerStarted","Data":"7ba1a136525bd929d69eff39c4fca37b960675bfe799041675d46acf59b4894d"} Sep 29 10:02:29 crc kubenswrapper[4891]: E0929 10:02:29.771694 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3c6f7d737e0196ec302f44354228d783ad3b210a75703dda3b39c15c01a67e8c\\\"\"" 
pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-9xmvx" podUID="48c30870-804a-4f13-95f4-ec4a5a02b536" Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.772190 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j55qd" Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.772212 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j55qd" Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.798277 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-q67kx" event={"ID":"1ab4abbc-82b1-4624-856b-cbd9062184c0","Type":"ContainerStarted","Data":"a77eb7fa8401aa7026167d4e97f49fa596ed6bdf15850d1db851b39ebe41d3dd"} Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.798367 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-q67kx" event={"ID":"1ab4abbc-82b1-4624-856b-cbd9062184c0","Type":"ContainerStarted","Data":"9f1a99cc925eb3655c616015bfee4fc0b11a9233e335d51ca4606b643dd694a2"} Sep 29 10:02:29 crc kubenswrapper[4891]: E0929 10:02:29.812467 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-q67kx" podUID="1ab4abbc-82b1-4624-856b-cbd9062184c0" Sep 29 10:02:29 crc kubenswrapper[4891]: I0929 10:02:29.856578 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j55qd" Sep 29 10:02:30 crc kubenswrapper[4891]: I0929 10:02:30.418823 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9eb7270e-4994-471c-bb92-356bf4d3708b" path="/var/lib/kubelet/pods/9eb7270e-4994-471c-bb92-356bf4d3708b/volumes" Sep 29 10:02:30 crc kubenswrapper[4891]: E0929 10:02:30.824576 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3c6f7d737e0196ec302f44354228d783ad3b210a75703dda3b39c15c01a67e8c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-9xmvx" podUID="48c30870-804a-4f13-95f4-ec4a5a02b536" Sep 29 10:02:30 crc kubenswrapper[4891]: E0929 10:02:30.827869 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-q67kx" podUID="1ab4abbc-82b1-4624-856b-cbd9062184c0" Sep 29 10:02:30 crc kubenswrapper[4891]: E0929 10:02:30.827926 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:a303e460aec09217f90043b8ff19c01061af003b614833b33a593df9c00ddf80\\\"\"" pod="openstack-operators/test-operator-controller-manager-f66b554c6-r2lgd" podUID="59104851-7ccd-446a-9441-ef993caefd10" Sep 29 10:02:30 crc kubenswrapper[4891]: E0929 10:02:30.828080 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:8961fc302c92bf476c1d00be0c02e964c449032f8d17672389cff40c71eeb1d3\\\"\"" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-r4w8j" 
podUID="40dffb60-1139-4864-b251-0aa8c145b66e" Sep 29 10:02:30 crc kubenswrapper[4891]: E0929 10:02:30.828926 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:bdf49c202aba5000737445bc4aeee6c5cdc6dd29c3dcd1394df9f8695830f9c6\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-f7t8j" podUID="177d1c2e-3396-4516-aed4-31227f05abff" Sep 29 10:02:30 crc kubenswrapper[4891]: E0929 10:02:30.829049 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-8b4x9" podUID="31d92d3d-3a46-416c-b5f0-6fb12bb5bead" Sep 29 10:02:30 crc kubenswrapper[4891]: I0929 10:02:30.899724 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j55qd" Sep 29 10:02:33 crc kubenswrapper[4891]: I0929 10:02:33.056894 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j55qd"] Sep 29 10:02:33 crc kubenswrapper[4891]: I0929 10:02:33.057515 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j55qd" podUID="f6488ebe-940b-4de5-8549-f0c903cae2de" containerName="registry-server" containerID="cri-o://676034d7568cef10854732592a3d9707d2905148558e707f565e3c2d43d412c7" gracePeriod=2 Sep 29 10:02:33 crc kubenswrapper[4891]: I0929 10:02:33.852180 4891 generic.go:334] "Generic (PLEG): container finished" podID="f6488ebe-940b-4de5-8549-f0c903cae2de" containerID="676034d7568cef10854732592a3d9707d2905148558e707f565e3c2d43d412c7" exitCode=0 Sep 29 
10:02:33 crc kubenswrapper[4891]: I0929 10:02:33.852241 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j55qd" event={"ID":"f6488ebe-940b-4de5-8549-f0c903cae2de","Type":"ContainerDied","Data":"676034d7568cef10854732592a3d9707d2905148558e707f565e3c2d43d412c7"} Sep 29 10:02:37 crc kubenswrapper[4891]: I0929 10:02:37.216685 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6475d4f6d5-ckgrz" Sep 29 10:02:39 crc kubenswrapper[4891]: E0929 10:02:39.772827 4891 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 676034d7568cef10854732592a3d9707d2905148558e707f565e3c2d43d412c7 is running failed: container process not found" containerID="676034d7568cef10854732592a3d9707d2905148558e707f565e3c2d43d412c7" cmd=["grpc_health_probe","-addr=:50051"] Sep 29 10:02:39 crc kubenswrapper[4891]: E0929 10:02:39.773138 4891 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 676034d7568cef10854732592a3d9707d2905148558e707f565e3c2d43d412c7 is running failed: container process not found" containerID="676034d7568cef10854732592a3d9707d2905148558e707f565e3c2d43d412c7" cmd=["grpc_health_probe","-addr=:50051"] Sep 29 10:02:39 crc kubenswrapper[4891]: E0929 10:02:39.773449 4891 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 676034d7568cef10854732592a3d9707d2905148558e707f565e3c2d43d412c7 is running failed: container process not found" containerID="676034d7568cef10854732592a3d9707d2905148558e707f565e3c2d43d412c7" cmd=["grpc_health_probe","-addr=:50051"] Sep 29 10:02:39 crc kubenswrapper[4891]: E0929 10:02:39.773480 4891 prober.go:104] "Probe errored" err="rpc error: 
code = NotFound desc = container is not created or running: checking if PID of 676034d7568cef10854732592a3d9707d2905148558e707f565e3c2d43d412c7 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-j55qd" podUID="f6488ebe-940b-4de5-8549-f0c903cae2de" containerName="registry-server" Sep 29 10:02:40 crc kubenswrapper[4891]: E0929 10:02:40.085849 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:21cbcae033ffacad33656192f6d2cb7db8502177af45fc7af4d27a79f50982c9" Sep 29 10:02:40 crc kubenswrapper[4891]: E0929 10:02:40.086082 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:21cbcae033ffacad33656192f6d2cb7db8502177af45fc7af4d27a79f50982c9,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n7wcl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-695847bc78-lwsw4_openstack-operators(543e23f1-51b6-489d-91d8-b1550bb69680): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:02:41 crc kubenswrapper[4891]: E0929 10:02:41.480502 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6" Sep 29 10:02:41 crc kubenswrapper[4891]: E0929 10:02:41.481695 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-w
orker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podif
ied-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTR
ON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-
centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:curre
nt-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_LIGHTSPEED_IMAGE_URL_DEFAULT,Value:quay.io/openstack-lightspeed/rag-content:os-docs-2024.2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-cent
os9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-maste
r-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-px5pg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-6d776955-f5bh2_openstack-operators(51a34f9a-d71a-45d0-9a76-01d629fc7d79): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:02:42 crc kubenswrapper[4891]: E0929 10:02:42.065368 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:b98ec0b50404626e0440bcf2e22f8d7ff06d1b1bd99f01830bceb8a2b27aa094" Sep 29 10:02:42 crc kubenswrapper[4891]: E0929 10:02:42.065673 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:b98ec0b50404626e0440bcf2e22f8d7ff06d1b1bd99f01830bceb8a2b27aa094,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2mbcv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
ironic-operator-controller-manager-9fc8d5567-8xchz_openstack-operators(bddd647a-c213-41dd-9f22-3cef16c4622b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:02:42 crc kubenswrapper[4891]: E0929 10:02:42.717908 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8" Sep 29 10:02:42 crc kubenswrapper[4891]: E0929 10:02:42.718512 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8pfdz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-76fcc6dc7c-t4j4f_openstack-operators(348984e7-163d-4396-84f5-319eb4fc79fb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:02:43 crc kubenswrapper[4891]: E0929 10:02:43.201437 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:d2eba62b82728578c57f60de5baa3562bc0a355f65123a9e5fedff385988eb64" Sep 29 10:02:43 crc kubenswrapper[4891]: E0929 10:02:43.201756 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:d2eba62b82728578c57f60de5baa3562bc0a355f65123a9e5fedff385988eb64,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bbkmw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
manila-operator-controller-manager-56cf9c6b99-6jllh_openstack-operators(9f51bd90-5b61-4cec-875e-d515cc501a22): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:02:43 crc kubenswrapper[4891]: E0929 10:02:43.704371 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:cd16b89f5e23d703fa0183db51a060d8d200bcfe2207b9bf565c73db6b5b9f03" Sep 29 10:02:43 crc kubenswrapper[4891]: E0929 10:02:43.704596 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:cd16b89f5e23d703fa0183db51a060d8d200bcfe2207b9bf565c73db6b5b9f03,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c48cb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-8ff95898-x2qwd_openstack-operators(b1ce187f-22c9-47c7-9f8b-e8d4b6c2aa31): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:02:44 crc kubenswrapper[4891]: E0929 10:02:44.465248 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:ae6fda8cafd6c3ab5d5e9c599d15b02ace61b8eacbac4de3df50427dfab6a0c0" Sep 29 10:02:44 crc kubenswrapper[4891]: E0929 10:02:44.465980 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:ae6fda8cafd6c3ab5d5e9c599d15b02ace61b8eacbac4de3df50427dfab6a0c0,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9f7z9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
telemetry-operator-controller-manager-5bf96cfbc4-8kqf5_openstack-operators(539a685d-4cdf-4344-a7a3-448ec5e9ba6e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:02:44 crc kubenswrapper[4891]: I0929 10:02:44.520767 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j55qd" Sep 29 10:02:44 crc kubenswrapper[4891]: I0929 10:02:44.623019 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6488ebe-940b-4de5-8549-f0c903cae2de-utilities\") pod \"f6488ebe-940b-4de5-8549-f0c903cae2de\" (UID: \"f6488ebe-940b-4de5-8549-f0c903cae2de\") " Sep 29 10:02:44 crc kubenswrapper[4891]: I0929 10:02:44.623152 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6488ebe-940b-4de5-8549-f0c903cae2de-catalog-content\") pod \"f6488ebe-940b-4de5-8549-f0c903cae2de\" (UID: \"f6488ebe-940b-4de5-8549-f0c903cae2de\") " Sep 29 10:02:44 crc kubenswrapper[4891]: I0929 10:02:44.623178 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75f5m\" (UniqueName: \"kubernetes.io/projected/f6488ebe-940b-4de5-8549-f0c903cae2de-kube-api-access-75f5m\") pod \"f6488ebe-940b-4de5-8549-f0c903cae2de\" (UID: \"f6488ebe-940b-4de5-8549-f0c903cae2de\") " Sep 29 10:02:44 crc kubenswrapper[4891]: I0929 10:02:44.625546 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6488ebe-940b-4de5-8549-f0c903cae2de-utilities" (OuterVolumeSpecName: "utilities") pod "f6488ebe-940b-4de5-8549-f0c903cae2de" (UID: "f6488ebe-940b-4de5-8549-f0c903cae2de"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:02:44 crc kubenswrapper[4891]: I0929 10:02:44.632398 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6488ebe-940b-4de5-8549-f0c903cae2de-kube-api-access-75f5m" (OuterVolumeSpecName: "kube-api-access-75f5m") pod "f6488ebe-940b-4de5-8549-f0c903cae2de" (UID: "f6488ebe-940b-4de5-8549-f0c903cae2de"). InnerVolumeSpecName "kube-api-access-75f5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:02:44 crc kubenswrapper[4891]: I0929 10:02:44.682521 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6488ebe-940b-4de5-8549-f0c903cae2de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6488ebe-940b-4de5-8549-f0c903cae2de" (UID: "f6488ebe-940b-4de5-8549-f0c903cae2de"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:02:44 crc kubenswrapper[4891]: I0929 10:02:44.726092 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6488ebe-940b-4de5-8549-f0c903cae2de-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:44 crc kubenswrapper[4891]: I0929 10:02:44.726673 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75f5m\" (UniqueName: \"kubernetes.io/projected/f6488ebe-940b-4de5-8549-f0c903cae2de-kube-api-access-75f5m\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:44 crc kubenswrapper[4891]: I0929 10:02:44.726698 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6488ebe-940b-4de5-8549-f0c903cae2de-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:44 crc kubenswrapper[4891]: E0929 10:02:44.937086 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = 
copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-lwsw4" podUID="543e23f1-51b6-489d-91d8-b1550bb69680" Sep 29 10:02:44 crc kubenswrapper[4891]: E0929 10:02:44.940933 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-8xchz" podUID="bddd647a-c213-41dd-9f22-3cef16c4622b" Sep 29 10:02:44 crc kubenswrapper[4891]: I0929 10:02:44.972369 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j55qd" event={"ID":"f6488ebe-940b-4de5-8549-f0c903cae2de","Type":"ContainerDied","Data":"e1e7df93ac41b415d421756ec32b41ad41312a0a1282a263b09d1716e4228d77"} Sep 29 10:02:44 crc kubenswrapper[4891]: I0929 10:02:44.972427 4891 scope.go:117] "RemoveContainer" containerID="676034d7568cef10854732592a3d9707d2905148558e707f565e3c2d43d412c7" Sep 29 10:02:44 crc kubenswrapper[4891]: I0929 10:02:44.972548 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j55qd" Sep 29 10:02:44 crc kubenswrapper[4891]: I0929 10:02:44.978575 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-8xchz" event={"ID":"bddd647a-c213-41dd-9f22-3cef16c4622b","Type":"ContainerStarted","Data":"3535428fcd6049d64f20e6a734bd9c2e92b5d6c5f42709914f3ab0fc1b234bb7"} Sep 29 10:02:44 crc kubenswrapper[4891]: E0929 10:02:44.980082 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:b98ec0b50404626e0440bcf2e22f8d7ff06d1b1bd99f01830bceb8a2b27aa094\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-8xchz" podUID="bddd647a-c213-41dd-9f22-3cef16c4622b" Sep 29 10:02:44 crc kubenswrapper[4891]: I0929 10:02:44.980661 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-lwsw4" event={"ID":"543e23f1-51b6-489d-91d8-b1550bb69680","Type":"ContainerStarted","Data":"67d6253528ab13b4b2281f24bc9827de241b0be291918576264a48c6c9d4edfc"} Sep 29 10:02:44 crc kubenswrapper[4891]: E0929 10:02:44.995140 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:21cbcae033ffacad33656192f6d2cb7db8502177af45fc7af4d27a79f50982c9\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-lwsw4" podUID="543e23f1-51b6-489d-91d8-b1550bb69680" Sep 29 10:02:45 crc kubenswrapper[4891]: I0929 10:02:45.025368 4891 scope.go:117] "RemoveContainer" containerID="96f69b9a0e25905231129df2047a7c80ce2c02d38b6a0d67878fd5f86c3596ac" Sep 29 10:02:45 crc kubenswrapper[4891]: E0929 10:02:45.026562 4891 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-t4j4f" podUID="348984e7-163d-4396-84f5-319eb4fc79fb" Sep 29 10:02:45 crc kubenswrapper[4891]: I0929 10:02:45.058309 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j55qd"] Sep 29 10:02:45 crc kubenswrapper[4891]: I0929 10:02:45.063124 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j55qd"] Sep 29 10:02:45 crc kubenswrapper[4891]: E0929 10:02:45.067549 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-f5bh2" podUID="51a34f9a-d71a-45d0-9a76-01d629fc7d79" Sep 29 10:02:45 crc kubenswrapper[4891]: E0929 10:02:45.076303 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-6jllh" podUID="9f51bd90-5b61-4cec-875e-d515cc501a22" Sep 29 10:02:45 crc kubenswrapper[4891]: I0929 10:02:45.137027 4891 scope.go:117] "RemoveContainer" containerID="72d6a750e14c9912fe2f6204665b52394f4b6075a101c172e0bb6db86ee18712" Sep 29 10:02:45 crc kubenswrapper[4891]: E0929 10:02:45.229942 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-8kqf5" podUID="539a685d-4cdf-4344-a7a3-448ec5e9ba6e" Sep 29 10:02:45 crc 
kubenswrapper[4891]: E0929 10:02:45.280403 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-8ff95898-x2qwd" podUID="b1ce187f-22c9-47c7-9f8b-e8d4b6c2aa31" Sep 29 10:02:46 crc kubenswrapper[4891]: I0929 10:02:46.008922 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-8kqf5" event={"ID":"539a685d-4cdf-4344-a7a3-448ec5e9ba6e","Type":"ContainerStarted","Data":"c0e99ad6b9e8a563110007f35a74b92e2fa0db4826f041c73cbe63bd23b9938b"} Sep 29 10:02:46 crc kubenswrapper[4891]: E0929 10:02:46.018924 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:ae6fda8cafd6c3ab5d5e9c599d15b02ace61b8eacbac4de3df50427dfab6a0c0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-8kqf5" podUID="539a685d-4cdf-4344-a7a3-448ec5e9ba6e" Sep 29 10:02:46 crc kubenswrapper[4891]: I0929 10:02:46.029628 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-zj5dm" event={"ID":"75843062-7193-4953-add3-5859f3dce7de","Type":"ContainerStarted","Data":"10f0854da5a8664176a951827eb29dcef5d35ac957b6bd9e02d2d997af0f2172"} Sep 29 10:02:46 crc kubenswrapper[4891]: I0929 10:02:46.036264 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-d45q4" event={"ID":"293261f0-9425-4e31-a66d-d8ad8a913228","Type":"ContainerStarted","Data":"d31bf410e0e15fcd1bfb466a7cbf3aa40c60c195a549f4a084e597284a090858"} Sep 29 10:02:46 crc kubenswrapper[4891]: I0929 10:02:46.055369 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-c7c776c96-hk7zb" event={"ID":"910f1b22-b26a-4e74-b716-89b912927374","Type":"ContainerStarted","Data":"1b83d109bba8b0691ba73c958ab3a45b84efda422431949ee2a7bc8bf0cd2647"} Sep 29 10:02:46 crc kubenswrapper[4891]: I0929 10:02:46.066764 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-748c574d75-qmk9v" event={"ID":"a7ad802e-1b9c-4ab0-a7eb-82932b6f5090","Type":"ContainerStarted","Data":"af6661bf032f83b81834ca8a7e965bfb5d0e8cb4285bd88fc70dcdaded25c244"} Sep 29 10:02:46 crc kubenswrapper[4891]: I0929 10:02:46.093383 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-gxs5h" event={"ID":"67ca192a-9f26-47d4-b299-35b0522e9e53","Type":"ContainerStarted","Data":"699955d0068641ca4480787803ae874a9eb9ec3ac1e74e4e96a92337a7f0ab3c"} Sep 29 10:02:46 crc kubenswrapper[4891]: I0929 10:02:46.093462 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-gxs5h" event={"ID":"67ca192a-9f26-47d4-b299-35b0522e9e53","Type":"ContainerStarted","Data":"88faa2f6feebe3ee934be03eacc7d17edb5293d33f59bc72ab3bad68af067226"} Sep 29 10:02:46 crc kubenswrapper[4891]: I0929 10:02:46.094908 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-gxs5h" Sep 29 10:02:46 crc kubenswrapper[4891]: I0929 10:02:46.144960 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-6jllh" event={"ID":"9f51bd90-5b61-4cec-875e-d515cc501a22","Type":"ContainerStarted","Data":"8a06cdc1781ce84098be566cbc355ec8c60b50d6db459bfe335a7b42cc751948"} Sep 29 10:02:46 crc kubenswrapper[4891]: E0929 10:02:46.155029 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:d2eba62b82728578c57f60de5baa3562bc0a355f65123a9e5fedff385988eb64\\\"\"" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-6jllh" podUID="9f51bd90-5b61-4cec-875e-d515cc501a22" Sep 29 10:02:46 crc kubenswrapper[4891]: I0929 10:02:46.156451 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-f5bh2" event={"ID":"51a34f9a-d71a-45d0-9a76-01d629fc7d79","Type":"ContainerStarted","Data":"0fe32b856b4d0e6521a96556136027b6334edc8134652667019598044422d779"} Sep 29 10:02:46 crc kubenswrapper[4891]: E0929 10:02:46.165892 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-f5bh2" podUID="51a34f9a-d71a-45d0-9a76-01d629fc7d79" Sep 29 10:02:46 crc kubenswrapper[4891]: I0929 10:02:46.170436 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-w2lkh" event={"ID":"28d145a8-69b6-4cf0-be6b-8bfbd0d2df07","Type":"ContainerStarted","Data":"5d1b49281c3f7e7e32813302d36bf0b6e5962a52586f25e00488d57690edfd87"} Sep 29 10:02:46 crc kubenswrapper[4891]: I0929 10:02:46.170776 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-gxs5h" podStartSLOduration=4.904881024 podStartE2EDuration="21.170747537s" podCreationTimestamp="2025-09-29 10:02:25 +0000 UTC" firstStartedPulling="2025-09-29 10:02:28.394113229 +0000 UTC m=+878.599281550" lastFinishedPulling="2025-09-29 10:02:44.659979742 +0000 UTC 
m=+894.865148063" observedRunningTime="2025-09-29 10:02:46.130441464 +0000 UTC m=+896.335609785" watchObservedRunningTime="2025-09-29 10:02:46.170747537 +0000 UTC m=+896.375915878" Sep 29 10:02:46 crc kubenswrapper[4891]: I0929 10:02:46.175253 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8ff95898-x2qwd" event={"ID":"b1ce187f-22c9-47c7-9f8b-e8d4b6c2aa31","Type":"ContainerStarted","Data":"2162ca2bcf04254887d2d520ecb1652698f7b4016412b6e9d187075af7d9d158"} Sep 29 10:02:46 crc kubenswrapper[4891]: E0929 10:02:46.181258 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:cd16b89f5e23d703fa0183db51a060d8d200bcfe2207b9bf565c73db6b5b9f03\\\"\"" pod="openstack-operators/heat-operator-controller-manager-8ff95898-x2qwd" podUID="b1ce187f-22c9-47c7-9f8b-e8d4b6c2aa31" Sep 29 10:02:46 crc kubenswrapper[4891]: I0929 10:02:46.181722 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-774b97b48-xlml4" event={"ID":"950148f3-aa8c-45bd-9922-6c4e2683d004","Type":"ContainerStarted","Data":"c72e31e609044dfc05f718b8179fadcfaed58eed6d648158fbde9595aaf660be"} Sep 29 10:02:46 crc kubenswrapper[4891]: I0929 10:02:46.192978 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-m5dl2" event={"ID":"a04ca278-c2e3-4b48-85f8-16972204c367","Type":"ContainerStarted","Data":"05d4161fe8b4668b75eddd346281341eff57e945de5df80cc5712b8226aea181"} Sep 29 10:02:46 crc kubenswrapper[4891]: I0929 10:02:46.197629 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-2nk4k" 
event={"ID":"6467aac8-0edf-44db-b402-518abc31f6a1","Type":"ContainerStarted","Data":"1e776436865ddd1b641d1b78c1b56ba1ef7af1422fc1959ddceaba4ad34a3a05"} Sep 29 10:02:46 crc kubenswrapper[4891]: I0929 10:02:46.200908 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-t4j4f" event={"ID":"348984e7-163d-4396-84f5-319eb4fc79fb","Type":"ContainerStarted","Data":"2a908afd2e770f3e6ae3cd2ad0b7cb7eea2158f25efd231ec2f6164408f3a249"} Sep 29 10:02:46 crc kubenswrapper[4891]: E0929 10:02:46.231993 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:b98ec0b50404626e0440bcf2e22f8d7ff06d1b1bd99f01830bceb8a2b27aa094\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-8xchz" podUID="bddd647a-c213-41dd-9f22-3cef16c4622b" Sep 29 10:02:46 crc kubenswrapper[4891]: E0929 10:02:46.244539 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:21cbcae033ffacad33656192f6d2cb7db8502177af45fc7af4d27a79f50982c9\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-lwsw4" podUID="543e23f1-51b6-489d-91d8-b1550bb69680" Sep 29 10:02:46 crc kubenswrapper[4891]: E0929 10:02:46.244673 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-t4j4f" podUID="348984e7-163d-4396-84f5-319eb4fc79fb" Sep 29 10:02:46 crc kubenswrapper[4891]: I0929 10:02:46.415278 4891 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6488ebe-940b-4de5-8549-f0c903cae2de" path="/var/lib/kubelet/pods/f6488ebe-940b-4de5-8549-f0c903cae2de/volumes" Sep 29 10:02:47 crc kubenswrapper[4891]: I0929 10:02:47.209452 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-d45q4" event={"ID":"293261f0-9425-4e31-a66d-d8ad8a913228","Type":"ContainerStarted","Data":"9a5e24dd6a042483cc1f324d2c21916c39881911ad3ac358822c007ec53ff469"} Sep 29 10:02:47 crc kubenswrapper[4891]: I0929 10:02:47.210836 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-d45q4" Sep 29 10:02:47 crc kubenswrapper[4891]: I0929 10:02:47.223704 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-774b97b48-xlml4" event={"ID":"950148f3-aa8c-45bd-9922-6c4e2683d004","Type":"ContainerStarted","Data":"3bfc0c9c2f75bc7df998c4f6602296e7ea404c7b2948a292327d7b13d7e1751c"} Sep 29 10:02:47 crc kubenswrapper[4891]: I0929 10:02:47.224625 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-774b97b48-xlml4" Sep 29 10:02:47 crc kubenswrapper[4891]: I0929 10:02:47.232588 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-d45q4" podStartSLOduration=5.032393766 podStartE2EDuration="21.232574628s" podCreationTimestamp="2025-09-29 10:02:26 +0000 UTC" firstStartedPulling="2025-09-29 10:02:28.480507859 +0000 UTC m=+878.685676180" lastFinishedPulling="2025-09-29 10:02:44.680688721 +0000 UTC m=+894.885857042" observedRunningTime="2025-09-29 10:02:47.229525817 +0000 UTC m=+897.434694148" watchObservedRunningTime="2025-09-29 10:02:47.232574628 +0000 UTC m=+897.437742949" Sep 29 10:02:47 crc kubenswrapper[4891]: I0929 
10:02:47.238510 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-m5dl2" event={"ID":"a04ca278-c2e3-4b48-85f8-16972204c367","Type":"ContainerStarted","Data":"a1888aa9a1b1aa34dc1a8fe49301b0a6331942022b14b9d092f4878db79ac16b"} Sep 29 10:02:47 crc kubenswrapper[4891]: I0929 10:02:47.238835 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-m5dl2" Sep 29 10:02:47 crc kubenswrapper[4891]: I0929 10:02:47.257502 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-748c574d75-qmk9v" event={"ID":"a7ad802e-1b9c-4ab0-a7eb-82932b6f5090","Type":"ContainerStarted","Data":"f4a596cabba32f0afa5f1e62690a04dce7296da656001c69b35993d78e34e138"} Sep 29 10:02:47 crc kubenswrapper[4891]: I0929 10:02:47.257750 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-748c574d75-qmk9v" Sep 29 10:02:47 crc kubenswrapper[4891]: I0929 10:02:47.258224 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-774b97b48-xlml4" podStartSLOduration=5.02680509 podStartE2EDuration="21.25776004s" podCreationTimestamp="2025-09-29 10:02:26 +0000 UTC" firstStartedPulling="2025-09-29 10:02:28.453472092 +0000 UTC m=+878.658640413" lastFinishedPulling="2025-09-29 10:02:44.684427042 +0000 UTC m=+894.889595363" observedRunningTime="2025-09-29 10:02:47.250968797 +0000 UTC m=+897.456137128" watchObservedRunningTime="2025-09-29 10:02:47.25776004 +0000 UTC m=+897.462928361" Sep 29 10:02:47 crc kubenswrapper[4891]: I0929 10:02:47.262931 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-2nk4k" 
event={"ID":"6467aac8-0edf-44db-b402-518abc31f6a1","Type":"ContainerStarted","Data":"7fd6574059f80b6c3b56bbf8dd455ec2f3b085a43caca3031d9ef5a88eac4808"} Sep 29 10:02:47 crc kubenswrapper[4891]: I0929 10:02:47.263123 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-2nk4k" Sep 29 10:02:47 crc kubenswrapper[4891]: I0929 10:02:47.269269 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-w2lkh" event={"ID":"28d145a8-69b6-4cf0-be6b-8bfbd0d2df07","Type":"ContainerStarted","Data":"5eec0c04e0d3d60df7c52cad7bdbeaceb3e718861fdc05bbb98fa6423672f24e"} Sep 29 10:02:47 crc kubenswrapper[4891]: I0929 10:02:47.269681 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-w2lkh" Sep 29 10:02:47 crc kubenswrapper[4891]: I0929 10:02:47.271272 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-m5dl2" podStartSLOduration=5.273729195 podStartE2EDuration="22.271252972s" podCreationTimestamp="2025-09-29 10:02:25 +0000 UTC" firstStartedPulling="2025-09-29 10:02:27.685622877 +0000 UTC m=+877.890791198" lastFinishedPulling="2025-09-29 10:02:44.683146654 +0000 UTC m=+894.888314975" observedRunningTime="2025-09-29 10:02:47.270007085 +0000 UTC m=+897.475175426" watchObservedRunningTime="2025-09-29 10:02:47.271252972 +0000 UTC m=+897.476421293" Sep 29 10:02:47 crc kubenswrapper[4891]: I0929 10:02:47.273730 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-hk7zb" event={"ID":"910f1b22-b26a-4e74-b716-89b912927374","Type":"ContainerStarted","Data":"fb869a80a46024190c198f1c1ca925ebbff531fc543508aebf8340fdbe4a80bf"} Sep 29 10:02:47 crc kubenswrapper[4891]: I0929 10:02:47.274225 4891 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-hk7zb" Sep 29 10:02:47 crc kubenswrapper[4891]: I0929 10:02:47.278602 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-zj5dm" event={"ID":"75843062-7193-4953-add3-5859f3dce7de","Type":"ContainerStarted","Data":"e16eec031c8e47bb84fe16cf82abfa5dd3b6de97e61e536cf3e555ea51314729"} Sep 29 10:02:47 crc kubenswrapper[4891]: E0929 10:02:47.290881 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:cd16b89f5e23d703fa0183db51a060d8d200bcfe2207b9bf565c73db6b5b9f03\\\"\"" pod="openstack-operators/heat-operator-controller-manager-8ff95898-x2qwd" podUID="b1ce187f-22c9-47c7-9f8b-e8d4b6c2aa31" Sep 29 10:02:47 crc kubenswrapper[4891]: E0929 10:02:47.290968 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-f5bh2" podUID="51a34f9a-d71a-45d0-9a76-01d629fc7d79" Sep 29 10:02:47 crc kubenswrapper[4891]: E0929 10:02:47.291066 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:d2eba62b82728578c57f60de5baa3562bc0a355f65123a9e5fedff385988eb64\\\"\"" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-6jllh" podUID="9f51bd90-5b61-4cec-875e-d515cc501a22" Sep 29 10:02:47 crc kubenswrapper[4891]: E0929 10:02:47.291137 4891 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:ae6fda8cafd6c3ab5d5e9c599d15b02ace61b8eacbac4de3df50427dfab6a0c0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-8kqf5" podUID="539a685d-4cdf-4344-a7a3-448ec5e9ba6e" Sep 29 10:02:47 crc kubenswrapper[4891]: E0929 10:02:47.291218 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-t4j4f" podUID="348984e7-163d-4396-84f5-319eb4fc79fb" Sep 29 10:02:47 crc kubenswrapper[4891]: I0929 10:02:47.297467 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-w2lkh" podStartSLOduration=5.088294559 podStartE2EDuration="22.297442154s" podCreationTimestamp="2025-09-29 10:02:25 +0000 UTC" firstStartedPulling="2025-09-29 10:02:27.474969888 +0000 UTC m=+877.680138209" lastFinishedPulling="2025-09-29 10:02:44.684117483 +0000 UTC m=+894.889285804" observedRunningTime="2025-09-29 10:02:47.290407764 +0000 UTC m=+897.495576095" watchObservedRunningTime="2025-09-29 10:02:47.297442154 +0000 UTC m=+897.502610475" Sep 29 10:02:47 crc kubenswrapper[4891]: I0929 10:02:47.326830 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-2nk4k" podStartSLOduration=6.126903117 podStartE2EDuration="22.326806781s" podCreationTimestamp="2025-09-29 10:02:25 +0000 UTC" firstStartedPulling="2025-09-29 10:02:28.480283922 +0000 UTC m=+878.685452243" lastFinishedPulling="2025-09-29 
10:02:44.680187586 +0000 UTC m=+894.885355907" observedRunningTime="2025-09-29 10:02:47.323332467 +0000 UTC m=+897.528500788" watchObservedRunningTime="2025-09-29 10:02:47.326806781 +0000 UTC m=+897.531975102" Sep 29 10:02:47 crc kubenswrapper[4891]: I0929 10:02:47.350270 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-748c574d75-qmk9v" podStartSLOduration=5.31975765 podStartE2EDuration="22.350246621s" podCreationTimestamp="2025-09-29 10:02:25 +0000 UTC" firstStartedPulling="2025-09-29 10:02:27.653151178 +0000 UTC m=+877.858319489" lastFinishedPulling="2025-09-29 10:02:44.683640139 +0000 UTC m=+894.888808460" observedRunningTime="2025-09-29 10:02:47.346588232 +0000 UTC m=+897.551756553" watchObservedRunningTime="2025-09-29 10:02:47.350246621 +0000 UTC m=+897.555414952" Sep 29 10:02:47 crc kubenswrapper[4891]: I0929 10:02:47.473488 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-hk7zb" podStartSLOduration=5.157506301 podStartE2EDuration="21.473453929s" podCreationTimestamp="2025-09-29 10:02:26 +0000 UTC" firstStartedPulling="2025-09-29 10:02:28.392497051 +0000 UTC m=+878.597665372" lastFinishedPulling="2025-09-29 10:02:44.708444679 +0000 UTC m=+894.913613000" observedRunningTime="2025-09-29 10:02:47.460902125 +0000 UTC m=+897.666070456" watchObservedRunningTime="2025-09-29 10:02:47.473453929 +0000 UTC m=+897.678622250" Sep 29 10:02:47 crc kubenswrapper[4891]: I0929 10:02:47.510609 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-zj5dm" podStartSLOduration=6.295382508 podStartE2EDuration="22.510573728s" podCreationTimestamp="2025-09-29 10:02:25 +0000 UTC" firstStartedPulling="2025-09-29 10:02:28.468735737 +0000 UTC m=+878.673904068" lastFinishedPulling="2025-09-29 10:02:44.683926967 +0000 UTC 
m=+894.889095288" observedRunningTime="2025-09-29 10:02:47.50562613 +0000 UTC m=+897.710794461" watchObservedRunningTime="2025-09-29 10:02:47.510573728 +0000 UTC m=+897.715742059" Sep 29 10:02:48 crc kubenswrapper[4891]: I0929 10:02:48.288682 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-zj5dm" Sep 29 10:02:56 crc kubenswrapper[4891]: I0929 10:02:56.044188 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-w2lkh" Sep 29 10:02:56 crc kubenswrapper[4891]: I0929 10:02:56.072527 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-748c574d75-qmk9v" Sep 29 10:02:56 crc kubenswrapper[4891]: I0929 10:02:56.386982 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-m5dl2" Sep 29 10:02:56 crc kubenswrapper[4891]: I0929 10:02:56.609371 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-2nk4k" Sep 29 10:02:56 crc kubenswrapper[4891]: I0929 10:02:56.625492 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-d45q4" Sep 29 10:02:56 crc kubenswrapper[4891]: I0929 10:02:56.732729 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-gxs5h" Sep 29 10:02:56 crc kubenswrapper[4891]: I0929 10:02:56.805715 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-hk7zb" Sep 29 10:02:56 crc kubenswrapper[4891]: I0929 10:02:56.838412 4891 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-zj5dm" Sep 29 10:02:56 crc kubenswrapper[4891]: I0929 10:02:56.868436 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-774b97b48-xlml4" Sep 29 10:02:57 crc kubenswrapper[4891]: I0929 10:02:57.403695 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-r4w8j" event={"ID":"40dffb60-1139-4864-b251-0aa8c145b66e","Type":"ContainerStarted","Data":"609724ba2a114bbe5b34f6dea9e3e216e77977f790c1375eb07f3b2e80cae0bb"} Sep 29 10:02:57 crc kubenswrapper[4891]: I0929 10:02:57.404208 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-r4w8j" Sep 29 10:02:57 crc kubenswrapper[4891]: I0929 10:02:57.406331 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-f7t8j" event={"ID":"177d1c2e-3396-4516-aed4-31227f05abff","Type":"ContainerStarted","Data":"f27b017f5a1294bc0a436b106dc0d58fea75d20a2c87e772e32aef1e2309a635"} Sep 29 10:02:57 crc kubenswrapper[4891]: I0929 10:02:57.406532 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-f7t8j" Sep 29 10:02:57 crc kubenswrapper[4891]: I0929 10:02:57.409137 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-9xmvx" event={"ID":"48c30870-804a-4f13-95f4-ec4a5a02b536","Type":"ContainerStarted","Data":"04a69385464f73362590f8fa9b662b5e1d42ea4a15e02c73fe16a9f7e36a8786"} Sep 29 10:02:57 crc kubenswrapper[4891]: I0929 10:02:57.409326 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-9xmvx" 
Sep 29 10:02:57 crc kubenswrapper[4891]: I0929 10:02:57.411934 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-r2lgd" event={"ID":"59104851-7ccd-446a-9441-ef993caefd10","Type":"ContainerStarted","Data":"99b0096b53261be064fa464b7a739916219fa698ebb9cf7aff73d383af41f3d8"} Sep 29 10:02:57 crc kubenswrapper[4891]: I0929 10:02:57.412186 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-f66b554c6-r2lgd" Sep 29 10:02:57 crc kubenswrapper[4891]: I0929 10:02:57.414730 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-8b4x9" event={"ID":"31d92d3d-3a46-416c-b5f0-6fb12bb5bead","Type":"ContainerStarted","Data":"300d1f3f2a442cc307d3a1d112a05969e11dec844bfd7c247d3dc2ef6b081c9d"} Sep 29 10:02:57 crc kubenswrapper[4891]: I0929 10:02:57.419243 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-q67kx" event={"ID":"1ab4abbc-82b1-4624-856b-cbd9062184c0","Type":"ContainerStarted","Data":"c22d686ec4ce5f66c3dd8b657fccb6cc13729e1fca030f9c2a718a776c7c9bc6"} Sep 29 10:02:57 crc kubenswrapper[4891]: I0929 10:02:57.419520 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-q67kx" Sep 29 10:02:57 crc kubenswrapper[4891]: I0929 10:02:57.426158 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-r4w8j" podStartSLOduration=4.789636432 podStartE2EDuration="32.42613368s" podCreationTimestamp="2025-09-29 10:02:25 +0000 UTC" firstStartedPulling="2025-09-29 10:02:28.572572677 +0000 UTC m=+878.777740998" lastFinishedPulling="2025-09-29 10:02:56.209069925 +0000 UTC m=+906.414238246" observedRunningTime="2025-09-29 10:02:57.423826052 
+0000 UTC m=+907.628994403" watchObservedRunningTime="2025-09-29 10:02:57.42613368 +0000 UTC m=+907.631302011" Sep 29 10:02:57 crc kubenswrapper[4891]: I0929 10:02:57.450948 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-9xmvx" podStartSLOduration=3.962488833 podStartE2EDuration="31.450924371s" podCreationTimestamp="2025-09-29 10:02:26 +0000 UTC" firstStartedPulling="2025-09-29 10:02:28.482634322 +0000 UTC m=+878.687802643" lastFinishedPulling="2025-09-29 10:02:55.97106986 +0000 UTC m=+906.176238181" observedRunningTime="2025-09-29 10:02:57.445261812 +0000 UTC m=+907.650430163" watchObservedRunningTime="2025-09-29 10:02:57.450924371 +0000 UTC m=+907.656092712" Sep 29 10:02:57 crc kubenswrapper[4891]: I0929 10:02:57.468209 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-f7t8j" podStartSLOduration=4.790036795 podStartE2EDuration="32.468184236s" podCreationTimestamp="2025-09-29 10:02:25 +0000 UTC" firstStartedPulling="2025-09-29 10:02:28.531255254 +0000 UTC m=+878.736423575" lastFinishedPulling="2025-09-29 10:02:56.209402695 +0000 UTC m=+906.414571016" observedRunningTime="2025-09-29 10:02:57.462084214 +0000 UTC m=+907.667252585" watchObservedRunningTime="2025-09-29 10:02:57.468184236 +0000 UTC m=+907.673352557" Sep 29 10:02:57 crc kubenswrapper[4891]: I0929 10:02:57.480056 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-q67kx" podStartSLOduration=3.828252745 podStartE2EDuration="31.4800334s" podCreationTimestamp="2025-09-29 10:02:26 +0000 UTC" firstStartedPulling="2025-09-29 10:02:28.55726621 +0000 UTC m=+878.762434531" lastFinishedPulling="2025-09-29 10:02:56.209046855 +0000 UTC m=+906.414215186" observedRunningTime="2025-09-29 10:02:57.477120773 +0000 UTC m=+907.682289114" 
watchObservedRunningTime="2025-09-29 10:02:57.4800334 +0000 UTC m=+907.685201751" Sep 29 10:02:57 crc kubenswrapper[4891]: I0929 10:02:57.502639 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-f66b554c6-r2lgd" podStartSLOduration=3.849238132 podStartE2EDuration="31.502617164s" podCreationTimestamp="2025-09-29 10:02:26 +0000 UTC" firstStartedPulling="2025-09-29 10:02:28.557823997 +0000 UTC m=+878.762992318" lastFinishedPulling="2025-09-29 10:02:56.211203029 +0000 UTC m=+906.416371350" observedRunningTime="2025-09-29 10:02:57.494702988 +0000 UTC m=+907.699871329" watchObservedRunningTime="2025-09-29 10:02:57.502617164 +0000 UTC m=+907.707785485" Sep 29 10:02:57 crc kubenswrapper[4891]: I0929 10:02:57.512859 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-8b4x9" podStartSLOduration=3.827996978 podStartE2EDuration="31.512838709s" podCreationTimestamp="2025-09-29 10:02:26 +0000 UTC" firstStartedPulling="2025-09-29 10:02:28.557840628 +0000 UTC m=+878.763008949" lastFinishedPulling="2025-09-29 10:02:56.242682359 +0000 UTC m=+906.447850680" observedRunningTime="2025-09-29 10:02:57.511311863 +0000 UTC m=+907.716480184" watchObservedRunningTime="2025-09-29 10:02:57.512838709 +0000 UTC m=+907.718007030" Sep 29 10:02:59 crc kubenswrapper[4891]: I0929 10:02:59.442535 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-lwsw4" event={"ID":"543e23f1-51b6-489d-91d8-b1550bb69680","Type":"ContainerStarted","Data":"b1f02bf1f16f8f2edeeafcfbb46b8208e9f6cf5537b2a28020742432f24a77ce"} Sep 29 10:02:59 crc kubenswrapper[4891]: I0929 10:02:59.443326 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-lwsw4" Sep 29 10:02:59 crc kubenswrapper[4891]: I0929 
10:02:59.445676 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-t4j4f" event={"ID":"348984e7-163d-4396-84f5-319eb4fc79fb","Type":"ContainerStarted","Data":"92396ffef3ceca1c443f1c29b0640cceb538a696121cf29a12481249eb1d4a1a"} Sep 29 10:02:59 crc kubenswrapper[4891]: I0929 10:02:59.445956 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-t4j4f" Sep 29 10:02:59 crc kubenswrapper[4891]: I0929 10:02:59.448732 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-6jllh" event={"ID":"9f51bd90-5b61-4cec-875e-d515cc501a22","Type":"ContainerStarted","Data":"175e80ffdfc3acc9f47aafdbe347e4ac4aa00b0fa57bb948ea7f69b87ef6700f"} Sep 29 10:02:59 crc kubenswrapper[4891]: I0929 10:02:59.448949 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-6jllh" Sep 29 10:02:59 crc kubenswrapper[4891]: I0929 10:02:59.451875 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-8kqf5" event={"ID":"539a685d-4cdf-4344-a7a3-448ec5e9ba6e","Type":"ContainerStarted","Data":"c8ca6749235d0aec495e4be8abbb856819117c666c47ef314f0d9fc166d6a5f8"} Sep 29 10:02:59 crc kubenswrapper[4891]: I0929 10:02:59.452122 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-8kqf5" Sep 29 10:02:59 crc kubenswrapper[4891]: I0929 10:02:59.477733 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-lwsw4" podStartSLOduration=3.4049163829999998 podStartE2EDuration="34.477703137s" podCreationTimestamp="2025-09-29 10:02:25 +0000 UTC" 
firstStartedPulling="2025-09-29 10:02:27.917216961 +0000 UTC m=+878.122385272" lastFinishedPulling="2025-09-29 10:02:58.990003695 +0000 UTC m=+909.195172026" observedRunningTime="2025-09-29 10:02:59.472980728 +0000 UTC m=+909.678149059" watchObservedRunningTime="2025-09-29 10:02:59.477703137 +0000 UTC m=+909.682871458" Sep 29 10:02:59 crc kubenswrapper[4891]: I0929 10:02:59.493723 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-6jllh" podStartSLOduration=3.770557591 podStartE2EDuration="34.493691457s" podCreationTimestamp="2025-09-29 10:02:25 +0000 UTC" firstStartedPulling="2025-09-29 10:02:28.469476599 +0000 UTC m=+878.674644920" lastFinishedPulling="2025-09-29 10:02:59.192610425 +0000 UTC m=+909.397778786" observedRunningTime="2025-09-29 10:02:59.490424441 +0000 UTC m=+909.695592762" watchObservedRunningTime="2025-09-29 10:02:59.493691457 +0000 UTC m=+909.698859778" Sep 29 10:02:59 crc kubenswrapper[4891]: I0929 10:02:59.514724 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-8kqf5" podStartSLOduration=3.057411384 podStartE2EDuration="33.514693853s" podCreationTimestamp="2025-09-29 10:02:26 +0000 UTC" firstStartedPulling="2025-09-29 10:02:28.46815918 +0000 UTC m=+878.673327501" lastFinishedPulling="2025-09-29 10:02:58.925441619 +0000 UTC m=+909.130609970" observedRunningTime="2025-09-29 10:02:59.513299672 +0000 UTC m=+909.718468023" watchObservedRunningTime="2025-09-29 10:02:59.514693853 +0000 UTC m=+909.719862194" Sep 29 10:02:59 crc kubenswrapper[4891]: I0929 10:02:59.540832 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-t4j4f" podStartSLOduration=3.082705698 podStartE2EDuration="33.54080643s" podCreationTimestamp="2025-09-29 10:02:26 +0000 UTC" 
firstStartedPulling="2025-09-29 10:02:28.46915098 +0000 UTC m=+878.674319301" lastFinishedPulling="2025-09-29 10:02:58.927251702 +0000 UTC m=+909.132420033" observedRunningTime="2025-09-29 10:02:59.53738597 +0000 UTC m=+909.742554311" watchObservedRunningTime="2025-09-29 10:02:59.54080643 +0000 UTC m=+909.745974761" Sep 29 10:03:00 crc kubenswrapper[4891]: I0929 10:03:00.465083 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8ff95898-x2qwd" event={"ID":"b1ce187f-22c9-47c7-9f8b-e8d4b6c2aa31","Type":"ContainerStarted","Data":"b471bb155a91f8d6bc87b95016c1b43f3c45ff788dfa33b486fd428cc37eef12"} Sep 29 10:03:00 crc kubenswrapper[4891]: I0929 10:03:00.465866 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-8ff95898-x2qwd" Sep 29 10:03:00 crc kubenswrapper[4891]: I0929 10:03:00.469206 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-8xchz" event={"ID":"bddd647a-c213-41dd-9f22-3cef16c4622b","Type":"ContainerStarted","Data":"f564e5af8969f6be2e7d6c429495429b142d60cae3ffa7cf656a8c17d79e369f"} Sep 29 10:03:00 crc kubenswrapper[4891]: I0929 10:03:00.492539 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-8ff95898-x2qwd" podStartSLOduration=2.898716393 podStartE2EDuration="35.492490969s" podCreationTimestamp="2025-09-29 10:02:25 +0000 UTC" firstStartedPulling="2025-09-29 10:02:27.433566422 +0000 UTC m=+877.638734743" lastFinishedPulling="2025-09-29 10:03:00.027340978 +0000 UTC m=+910.232509319" observedRunningTime="2025-09-29 10:03:00.485979828 +0000 UTC m=+910.691148169" watchObservedRunningTime="2025-09-29 10:03:00.492490969 +0000 UTC m=+910.697659300" Sep 29 10:03:00 crc kubenswrapper[4891]: I0929 10:03:00.514436 4891 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-8xchz" podStartSLOduration=3.044392364 podStartE2EDuration="35.514414483s" podCreationTimestamp="2025-09-29 10:02:25 +0000 UTC" firstStartedPulling="2025-09-29 10:02:27.645896931 +0000 UTC m=+877.851065252" lastFinishedPulling="2025-09-29 10:03:00.11591904 +0000 UTC m=+910.321087371" observedRunningTime="2025-09-29 10:03:00.512842367 +0000 UTC m=+910.718010698" watchObservedRunningTime="2025-09-29 10:03:00.514414483 +0000 UTC m=+910.719582804" Sep 29 10:03:02 crc kubenswrapper[4891]: I0929 10:03:02.499817 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-f5bh2" event={"ID":"51a34f9a-d71a-45d0-9a76-01d629fc7d79","Type":"ContainerStarted","Data":"cec672b86dc20caa27a95f84d71ba30e52208c1a60dafd6bcfaaed63649af0ba"} Sep 29 10:03:02 crc kubenswrapper[4891]: I0929 10:03:02.500595 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-f5bh2" Sep 29 10:03:02 crc kubenswrapper[4891]: I0929 10:03:02.545118 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-f5bh2" podStartSLOduration=3.902967378 podStartE2EDuration="36.545080877s" podCreationTimestamp="2025-09-29 10:02:26 +0000 UTC" firstStartedPulling="2025-09-29 10:02:29.251595609 +0000 UTC m=+879.456763920" lastFinishedPulling="2025-09-29 10:03:01.893709068 +0000 UTC m=+912.098877419" observedRunningTime="2025-09-29 10:03:02.537302709 +0000 UTC m=+912.742471070" watchObservedRunningTime="2025-09-29 10:03:02.545080877 +0000 UTC m=+912.750249208" Sep 29 10:03:06 crc kubenswrapper[4891]: I0929 10:03:06.148339 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-8ff95898-x2qwd" Sep 29 
10:03:06 crc kubenswrapper[4891]: I0929 10:03:06.274205 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-8xchz" Sep 29 10:03:06 crc kubenswrapper[4891]: I0929 10:03:06.277926 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-8xchz" Sep 29 10:03:06 crc kubenswrapper[4891]: I0929 10:03:06.420429 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-r4w8j" Sep 29 10:03:06 crc kubenswrapper[4891]: I0929 10:03:06.472434 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-lwsw4" Sep 29 10:03:06 crc kubenswrapper[4891]: I0929 10:03:06.609542 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-6jllh" Sep 29 10:03:06 crc kubenswrapper[4891]: I0929 10:03:06.705080 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-f7t8j" Sep 29 10:03:06 crc kubenswrapper[4891]: I0929 10:03:06.779702 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-t4j4f" Sep 29 10:03:06 crc kubenswrapper[4891]: I0929 10:03:06.907984 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-9xmvx" Sep 29 10:03:07 crc kubenswrapper[4891]: I0929 10:03:07.075380 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-8kqf5" Sep 29 10:03:07 crc kubenswrapper[4891]: I0929 10:03:07.156104 4891 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-f66b554c6-r2lgd" Sep 29 10:03:07 crc kubenswrapper[4891]: I0929 10:03:07.173523 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-q67kx" Sep 29 10:03:08 crc kubenswrapper[4891]: I0929 10:03:08.438196 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-f5bh2" Sep 29 10:03:24 crc kubenswrapper[4891]: I0929 10:03:24.881367 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pjm79"] Sep 29 10:03:24 crc kubenswrapper[4891]: E0929 10:03:24.882390 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eb7270e-4994-471c-bb92-356bf4d3708b" containerName="registry-server" Sep 29 10:03:24 crc kubenswrapper[4891]: I0929 10:03:24.882404 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eb7270e-4994-471c-bb92-356bf4d3708b" containerName="registry-server" Sep 29 10:03:24 crc kubenswrapper[4891]: E0929 10:03:24.882420 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eb7270e-4994-471c-bb92-356bf4d3708b" containerName="extract-content" Sep 29 10:03:24 crc kubenswrapper[4891]: I0929 10:03:24.882426 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eb7270e-4994-471c-bb92-356bf4d3708b" containerName="extract-content" Sep 29 10:03:24 crc kubenswrapper[4891]: E0929 10:03:24.882448 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6488ebe-940b-4de5-8549-f0c903cae2de" containerName="extract-content" Sep 29 10:03:24 crc kubenswrapper[4891]: I0929 10:03:24.882455 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6488ebe-940b-4de5-8549-f0c903cae2de" containerName="extract-content" Sep 29 10:03:24 crc kubenswrapper[4891]: E0929 10:03:24.882463 
4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6488ebe-940b-4de5-8549-f0c903cae2de" containerName="extract-utilities" Sep 29 10:03:24 crc kubenswrapper[4891]: I0929 10:03:24.882469 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6488ebe-940b-4de5-8549-f0c903cae2de" containerName="extract-utilities" Sep 29 10:03:24 crc kubenswrapper[4891]: E0929 10:03:24.882489 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eb7270e-4994-471c-bb92-356bf4d3708b" containerName="extract-utilities" Sep 29 10:03:24 crc kubenswrapper[4891]: I0929 10:03:24.882498 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eb7270e-4994-471c-bb92-356bf4d3708b" containerName="extract-utilities" Sep 29 10:03:24 crc kubenswrapper[4891]: E0929 10:03:24.882534 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6488ebe-940b-4de5-8549-f0c903cae2de" containerName="registry-server" Sep 29 10:03:24 crc kubenswrapper[4891]: I0929 10:03:24.882541 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6488ebe-940b-4de5-8549-f0c903cae2de" containerName="registry-server" Sep 29 10:03:24 crc kubenswrapper[4891]: I0929 10:03:24.882682 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6488ebe-940b-4de5-8549-f0c903cae2de" containerName="registry-server" Sep 29 10:03:24 crc kubenswrapper[4891]: I0929 10:03:24.882705 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="9eb7270e-4994-471c-bb92-356bf4d3708b" containerName="registry-server" Sep 29 10:03:24 crc kubenswrapper[4891]: I0929 10:03:24.883589 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-pjm79" Sep 29 10:03:24 crc kubenswrapper[4891]: I0929 10:03:24.886981 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Sep 29 10:03:24 crc kubenswrapper[4891]: I0929 10:03:24.887156 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-qsqkb" Sep 29 10:03:24 crc kubenswrapper[4891]: I0929 10:03:24.887463 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Sep 29 10:03:24 crc kubenswrapper[4891]: I0929 10:03:24.890163 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Sep 29 10:03:24 crc kubenswrapper[4891]: I0929 10:03:24.910760 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pjm79"] Sep 29 10:03:24 crc kubenswrapper[4891]: I0929 10:03:24.942631 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-l62kp"] Sep 29 10:03:24 crc kubenswrapper[4891]: I0929 10:03:24.959403 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-l62kp" Sep 29 10:03:24 crc kubenswrapper[4891]: I0929 10:03:24.966058 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Sep 29 10:03:24 crc kubenswrapper[4891]: I0929 10:03:24.967195 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-l62kp"] Sep 29 10:03:25 crc kubenswrapper[4891]: I0929 10:03:25.017981 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f192589-5b85-445d-b39f-3e3ef03bd932-config\") pod \"dnsmasq-dns-675f4bcbfc-pjm79\" (UID: \"4f192589-5b85-445d-b39f-3e3ef03bd932\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pjm79" Sep 29 10:03:25 crc kubenswrapper[4891]: I0929 10:03:25.018095 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfzkc\" (UniqueName: \"kubernetes.io/projected/4f192589-5b85-445d-b39f-3e3ef03bd932-kube-api-access-mfzkc\") pod \"dnsmasq-dns-675f4bcbfc-pjm79\" (UID: \"4f192589-5b85-445d-b39f-3e3ef03bd932\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pjm79" Sep 29 10:03:25 crc kubenswrapper[4891]: I0929 10:03:25.119809 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/876dc782-d709-4ad3-a7c0-1dd7bd42e358-config\") pod \"dnsmasq-dns-78dd6ddcc-l62kp\" (UID: \"876dc782-d709-4ad3-a7c0-1dd7bd42e358\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l62kp" Sep 29 10:03:25 crc kubenswrapper[4891]: I0929 10:03:25.120213 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/876dc782-d709-4ad3-a7c0-1dd7bd42e358-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-l62kp\" (UID: \"876dc782-d709-4ad3-a7c0-1dd7bd42e358\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l62kp" Sep 29 10:03:25 
crc kubenswrapper[4891]: I0929 10:03:25.120261 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqq9q\" (UniqueName: \"kubernetes.io/projected/876dc782-d709-4ad3-a7c0-1dd7bd42e358-kube-api-access-zqq9q\") pod \"dnsmasq-dns-78dd6ddcc-l62kp\" (UID: \"876dc782-d709-4ad3-a7c0-1dd7bd42e358\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l62kp" Sep 29 10:03:25 crc kubenswrapper[4891]: I0929 10:03:25.120301 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f192589-5b85-445d-b39f-3e3ef03bd932-config\") pod \"dnsmasq-dns-675f4bcbfc-pjm79\" (UID: \"4f192589-5b85-445d-b39f-3e3ef03bd932\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pjm79" Sep 29 10:03:25 crc kubenswrapper[4891]: I0929 10:03:25.121055 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfzkc\" (UniqueName: \"kubernetes.io/projected/4f192589-5b85-445d-b39f-3e3ef03bd932-kube-api-access-mfzkc\") pod \"dnsmasq-dns-675f4bcbfc-pjm79\" (UID: \"4f192589-5b85-445d-b39f-3e3ef03bd932\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pjm79" Sep 29 10:03:25 crc kubenswrapper[4891]: I0929 10:03:25.121674 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f192589-5b85-445d-b39f-3e3ef03bd932-config\") pod \"dnsmasq-dns-675f4bcbfc-pjm79\" (UID: \"4f192589-5b85-445d-b39f-3e3ef03bd932\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pjm79" Sep 29 10:03:25 crc kubenswrapper[4891]: I0929 10:03:25.142775 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfzkc\" (UniqueName: \"kubernetes.io/projected/4f192589-5b85-445d-b39f-3e3ef03bd932-kube-api-access-mfzkc\") pod \"dnsmasq-dns-675f4bcbfc-pjm79\" (UID: \"4f192589-5b85-445d-b39f-3e3ef03bd932\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pjm79" Sep 29 10:03:25 crc kubenswrapper[4891]: I0929 
10:03:25.202983 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-pjm79" Sep 29 10:03:25 crc kubenswrapper[4891]: I0929 10:03:25.222847 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/876dc782-d709-4ad3-a7c0-1dd7bd42e358-config\") pod \"dnsmasq-dns-78dd6ddcc-l62kp\" (UID: \"876dc782-d709-4ad3-a7c0-1dd7bd42e358\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l62kp" Sep 29 10:03:25 crc kubenswrapper[4891]: I0929 10:03:25.222909 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/876dc782-d709-4ad3-a7c0-1dd7bd42e358-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-l62kp\" (UID: \"876dc782-d709-4ad3-a7c0-1dd7bd42e358\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l62kp" Sep 29 10:03:25 crc kubenswrapper[4891]: I0929 10:03:25.222954 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqq9q\" (UniqueName: \"kubernetes.io/projected/876dc782-d709-4ad3-a7c0-1dd7bd42e358-kube-api-access-zqq9q\") pod \"dnsmasq-dns-78dd6ddcc-l62kp\" (UID: \"876dc782-d709-4ad3-a7c0-1dd7bd42e358\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l62kp" Sep 29 10:03:25 crc kubenswrapper[4891]: I0929 10:03:25.224155 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/876dc782-d709-4ad3-a7c0-1dd7bd42e358-config\") pod \"dnsmasq-dns-78dd6ddcc-l62kp\" (UID: \"876dc782-d709-4ad3-a7c0-1dd7bd42e358\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l62kp" Sep 29 10:03:25 crc kubenswrapper[4891]: I0929 10:03:25.224241 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/876dc782-d709-4ad3-a7c0-1dd7bd42e358-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-l62kp\" (UID: \"876dc782-d709-4ad3-a7c0-1dd7bd42e358\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-l62kp" Sep 29 10:03:25 crc kubenswrapper[4891]: I0929 10:03:25.248136 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqq9q\" (UniqueName: \"kubernetes.io/projected/876dc782-d709-4ad3-a7c0-1dd7bd42e358-kube-api-access-zqq9q\") pod \"dnsmasq-dns-78dd6ddcc-l62kp\" (UID: \"876dc782-d709-4ad3-a7c0-1dd7bd42e358\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l62kp" Sep 29 10:03:25 crc kubenswrapper[4891]: I0929 10:03:25.279905 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-l62kp" Sep 29 10:03:25 crc kubenswrapper[4891]: I0929 10:03:25.564779 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-l62kp"] Sep 29 10:03:25 crc kubenswrapper[4891]: I0929 10:03:25.571937 4891 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 10:03:25 crc kubenswrapper[4891]: I0929 10:03:25.668404 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pjm79"] Sep 29 10:03:25 crc kubenswrapper[4891]: W0929 10:03:25.671371 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f192589_5b85_445d_b39f_3e3ef03bd932.slice/crio-33615884b21add34046a94d0eda403defd858f456d2fd0f3d4c3babcb04fef9d WatchSource:0}: Error finding container 33615884b21add34046a94d0eda403defd858f456d2fd0f3d4c3babcb04fef9d: Status 404 returned error can't find the container with id 33615884b21add34046a94d0eda403defd858f456d2fd0f3d4c3babcb04fef9d Sep 29 10:03:25 crc kubenswrapper[4891]: I0929 10:03:25.698896 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-l62kp" event={"ID":"876dc782-d709-4ad3-a7c0-1dd7bd42e358","Type":"ContainerStarted","Data":"02769ad68b7d5c2c24c3cd24195e74f9b0848bcfafeaffe7f640a3ca174885bd"} Sep 29 10:03:25 crc 
kubenswrapper[4891]: I0929 10:03:25.700648 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-pjm79" event={"ID":"4f192589-5b85-445d-b39f-3e3ef03bd932","Type":"ContainerStarted","Data":"33615884b21add34046a94d0eda403defd858f456d2fd0f3d4c3babcb04fef9d"} Sep 29 10:03:27 crc kubenswrapper[4891]: I0929 10:03:27.609711 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pjm79"] Sep 29 10:03:27 crc kubenswrapper[4891]: I0929 10:03:27.650895 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-snz4k"] Sep 29 10:03:27 crc kubenswrapper[4891]: I0929 10:03:27.652437 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-snz4k" Sep 29 10:03:27 crc kubenswrapper[4891]: I0929 10:03:27.669547 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfpcm\" (UniqueName: \"kubernetes.io/projected/d9326b29-45ea-4953-ab07-fe1812d0fdde-kube-api-access-tfpcm\") pod \"dnsmasq-dns-666b6646f7-snz4k\" (UID: \"d9326b29-45ea-4953-ab07-fe1812d0fdde\") " pod="openstack/dnsmasq-dns-666b6646f7-snz4k" Sep 29 10:03:27 crc kubenswrapper[4891]: I0929 10:03:27.669615 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9326b29-45ea-4953-ab07-fe1812d0fdde-dns-svc\") pod \"dnsmasq-dns-666b6646f7-snz4k\" (UID: \"d9326b29-45ea-4953-ab07-fe1812d0fdde\") " pod="openstack/dnsmasq-dns-666b6646f7-snz4k" Sep 29 10:03:27 crc kubenswrapper[4891]: I0929 10:03:27.669683 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9326b29-45ea-4953-ab07-fe1812d0fdde-config\") pod \"dnsmasq-dns-666b6646f7-snz4k\" (UID: \"d9326b29-45ea-4953-ab07-fe1812d0fdde\") " 
pod="openstack/dnsmasq-dns-666b6646f7-snz4k" Sep 29 10:03:27 crc kubenswrapper[4891]: I0929 10:03:27.680103 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-snz4k"] Sep 29 10:03:27 crc kubenswrapper[4891]: I0929 10:03:27.770890 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9326b29-45ea-4953-ab07-fe1812d0fdde-config\") pod \"dnsmasq-dns-666b6646f7-snz4k\" (UID: \"d9326b29-45ea-4953-ab07-fe1812d0fdde\") " pod="openstack/dnsmasq-dns-666b6646f7-snz4k" Sep 29 10:03:27 crc kubenswrapper[4891]: I0929 10:03:27.771256 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfpcm\" (UniqueName: \"kubernetes.io/projected/d9326b29-45ea-4953-ab07-fe1812d0fdde-kube-api-access-tfpcm\") pod \"dnsmasq-dns-666b6646f7-snz4k\" (UID: \"d9326b29-45ea-4953-ab07-fe1812d0fdde\") " pod="openstack/dnsmasq-dns-666b6646f7-snz4k" Sep 29 10:03:27 crc kubenswrapper[4891]: I0929 10:03:27.771349 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9326b29-45ea-4953-ab07-fe1812d0fdde-dns-svc\") pod \"dnsmasq-dns-666b6646f7-snz4k\" (UID: \"d9326b29-45ea-4953-ab07-fe1812d0fdde\") " pod="openstack/dnsmasq-dns-666b6646f7-snz4k" Sep 29 10:03:27 crc kubenswrapper[4891]: I0929 10:03:27.772191 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9326b29-45ea-4953-ab07-fe1812d0fdde-config\") pod \"dnsmasq-dns-666b6646f7-snz4k\" (UID: \"d9326b29-45ea-4953-ab07-fe1812d0fdde\") " pod="openstack/dnsmasq-dns-666b6646f7-snz4k" Sep 29 10:03:27 crc kubenswrapper[4891]: I0929 10:03:27.772280 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9326b29-45ea-4953-ab07-fe1812d0fdde-dns-svc\") pod \"dnsmasq-dns-666b6646f7-snz4k\" 
(UID: \"d9326b29-45ea-4953-ab07-fe1812d0fdde\") " pod="openstack/dnsmasq-dns-666b6646f7-snz4k" Sep 29 10:03:27 crc kubenswrapper[4891]: I0929 10:03:27.800737 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfpcm\" (UniqueName: \"kubernetes.io/projected/d9326b29-45ea-4953-ab07-fe1812d0fdde-kube-api-access-tfpcm\") pod \"dnsmasq-dns-666b6646f7-snz4k\" (UID: \"d9326b29-45ea-4953-ab07-fe1812d0fdde\") " pod="openstack/dnsmasq-dns-666b6646f7-snz4k" Sep 29 10:03:27 crc kubenswrapper[4891]: I0929 10:03:27.973868 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-snz4k" Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.026992 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-l62kp"] Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.059924 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mtbzs"] Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.061277 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mtbzs" Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.076869 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5435582b-ebc3-42e5-8a85-65eef8fe8263-config\") pod \"dnsmasq-dns-57d769cc4f-mtbzs\" (UID: \"5435582b-ebc3-42e5-8a85-65eef8fe8263\") " pod="openstack/dnsmasq-dns-57d769cc4f-mtbzs" Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.076978 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5435582b-ebc3-42e5-8a85-65eef8fe8263-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-mtbzs\" (UID: \"5435582b-ebc3-42e5-8a85-65eef8fe8263\") " pod="openstack/dnsmasq-dns-57d769cc4f-mtbzs" Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.077008 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkdjm\" (UniqueName: \"kubernetes.io/projected/5435582b-ebc3-42e5-8a85-65eef8fe8263-kube-api-access-hkdjm\") pod \"dnsmasq-dns-57d769cc4f-mtbzs\" (UID: \"5435582b-ebc3-42e5-8a85-65eef8fe8263\") " pod="openstack/dnsmasq-dns-57d769cc4f-mtbzs" Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.079282 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mtbzs"] Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.178176 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5435582b-ebc3-42e5-8a85-65eef8fe8263-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-mtbzs\" (UID: \"5435582b-ebc3-42e5-8a85-65eef8fe8263\") " pod="openstack/dnsmasq-dns-57d769cc4f-mtbzs" Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.178223 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkdjm\" (UniqueName: 
\"kubernetes.io/projected/5435582b-ebc3-42e5-8a85-65eef8fe8263-kube-api-access-hkdjm\") pod \"dnsmasq-dns-57d769cc4f-mtbzs\" (UID: \"5435582b-ebc3-42e5-8a85-65eef8fe8263\") " pod="openstack/dnsmasq-dns-57d769cc4f-mtbzs" Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.178310 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5435582b-ebc3-42e5-8a85-65eef8fe8263-config\") pod \"dnsmasq-dns-57d769cc4f-mtbzs\" (UID: \"5435582b-ebc3-42e5-8a85-65eef8fe8263\") " pod="openstack/dnsmasq-dns-57d769cc4f-mtbzs" Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.179380 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5435582b-ebc3-42e5-8a85-65eef8fe8263-config\") pod \"dnsmasq-dns-57d769cc4f-mtbzs\" (UID: \"5435582b-ebc3-42e5-8a85-65eef8fe8263\") " pod="openstack/dnsmasq-dns-57d769cc4f-mtbzs" Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.179896 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5435582b-ebc3-42e5-8a85-65eef8fe8263-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-mtbzs\" (UID: \"5435582b-ebc3-42e5-8a85-65eef8fe8263\") " pod="openstack/dnsmasq-dns-57d769cc4f-mtbzs" Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.229896 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkdjm\" (UniqueName: \"kubernetes.io/projected/5435582b-ebc3-42e5-8a85-65eef8fe8263-kube-api-access-hkdjm\") pod \"dnsmasq-dns-57d769cc4f-mtbzs\" (UID: \"5435582b-ebc3-42e5-8a85-65eef8fe8263\") " pod="openstack/dnsmasq-dns-57d769cc4f-mtbzs" Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.384184 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mtbzs" Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.518964 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-snz4k"] Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.734637 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-snz4k" event={"ID":"d9326b29-45ea-4953-ab07-fe1812d0fdde","Type":"ContainerStarted","Data":"9ca049e05351e5c8386119dac7d33b86ceb9b96e58c37fb68725848c381bf717"} Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.926896 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mtbzs"] Sep 29 10:03:28 crc kubenswrapper[4891]: W0929 10:03:28.939555 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5435582b_ebc3_42e5_8a85_65eef8fe8263.slice/crio-ece187dcc289e34fd72d355309f3b66d2808ef7394f7268c7fd0ec0e621f551b WatchSource:0}: Error finding container ece187dcc289e34fd72d355309f3b66d2808ef7394f7268c7fd0ec0e621f551b: Status 404 returned error can't find the container with id ece187dcc289e34fd72d355309f3b66d2808ef7394f7268c7fd0ec0e621f551b Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.979608 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.981469 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.985122 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-n25k7" Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.985429 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.985767 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.986016 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.987590 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.988029 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.987613 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 29 10:03:28 crc kubenswrapper[4891]: I0929 10:03:28.995381 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.097781 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f895d522-5026-4d72-862e-1a2b1bd5ee3c-config-data\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.098016 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/f895d522-5026-4d72-862e-1a2b1bd5ee3c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.098091 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f895d522-5026-4d72-862e-1a2b1bd5ee3c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.098190 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f895d522-5026-4d72-862e-1a2b1bd5ee3c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.098277 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f895d522-5026-4d72-862e-1a2b1bd5ee3c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.098480 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f895d522-5026-4d72-862e-1a2b1bd5ee3c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.098674 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/f895d522-5026-4d72-862e-1a2b1bd5ee3c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.099163 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f895d522-5026-4d72-862e-1a2b1bd5ee3c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.099295 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.099355 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f895d522-5026-4d72-862e-1a2b1bd5ee3c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.099378 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp8w5\" (UniqueName: \"kubernetes.io/projected/f895d522-5026-4d72-862e-1a2b1bd5ee3c-kube-api-access-gp8w5\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.201055 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f895d522-5026-4d72-862e-1a2b1bd5ee3c-rabbitmq-plugins\") pod 
\"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.201180 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f895d522-5026-4d72-862e-1a2b1bd5ee3c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.201208 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp8w5\" (UniqueName: \"kubernetes.io/projected/f895d522-5026-4d72-862e-1a2b1bd5ee3c-kube-api-access-gp8w5\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.201255 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.201317 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f895d522-5026-4d72-862e-1a2b1bd5ee3c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.201352 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f895d522-5026-4d72-862e-1a2b1bd5ee3c-config-data\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.201439 4891 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f895d522-5026-4d72-862e-1a2b1bd5ee3c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.201479 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f895d522-5026-4d72-862e-1a2b1bd5ee3c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.201509 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f895d522-5026-4d72-862e-1a2b1bd5ee3c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.201551 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f895d522-5026-4d72-862e-1a2b1bd5ee3c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.202465 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f895d522-5026-4d72-862e-1a2b1bd5ee3c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.202525 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/f895d522-5026-4d72-862e-1a2b1bd5ee3c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.203066 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f895d522-5026-4d72-862e-1a2b1bd5ee3c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.203111 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f895d522-5026-4d72-862e-1a2b1bd5ee3c-config-data\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.203132 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f895d522-5026-4d72-862e-1a2b1bd5ee3c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.203351 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f895d522-5026-4d72-862e-1a2b1bd5ee3c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.203603 4891 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") device mount path \"/mnt/openstack/pv01\"" 
pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.209053 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f895d522-5026-4d72-862e-1a2b1bd5ee3c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.211876 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f895d522-5026-4d72-862e-1a2b1bd5ee3c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.216032 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f895d522-5026-4d72-862e-1a2b1bd5ee3c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.225832 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp8w5\" (UniqueName: \"kubernetes.io/projected/f895d522-5026-4d72-862e-1a2b1bd5ee3c-kube-api-access-gp8w5\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.226235 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f895d522-5026-4d72-862e-1a2b1bd5ee3c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.227954 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.252921 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.257173 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.260375 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.261751 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.262013 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.262415 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.262504 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.262418 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.262664 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-zdhqs" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.272258 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.322016 4891 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.406710 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-286m5\" (UniqueName: \"kubernetes.io/projected/8fd6ea18-7472-42de-b949-140181cd55a5-kube-api-access-286m5\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.406762 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8fd6ea18-7472-42de-b949-140181cd55a5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.406808 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8fd6ea18-7472-42de-b949-140181cd55a5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.407134 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.407345 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8fd6ea18-7472-42de-b949-140181cd55a5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") 
" pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.407438 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8fd6ea18-7472-42de-b949-140181cd55a5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.407659 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8fd6ea18-7472-42de-b949-140181cd55a5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.407701 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8fd6ea18-7472-42de-b949-140181cd55a5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.407727 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8fd6ea18-7472-42de-b949-140181cd55a5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.407753 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8fd6ea18-7472-42de-b949-140181cd55a5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " 
pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.407841 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8fd6ea18-7472-42de-b949-140181cd55a5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.509810 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8fd6ea18-7472-42de-b949-140181cd55a5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.509857 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8fd6ea18-7472-42de-b949-140181cd55a5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.509906 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8fd6ea18-7472-42de-b949-140181cd55a5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.509924 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8fd6ea18-7472-42de-b949-140181cd55a5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: 
I0929 10:03:29.509948 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8fd6ea18-7472-42de-b949-140181cd55a5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.509989 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-286m5\" (UniqueName: \"kubernetes.io/projected/8fd6ea18-7472-42de-b949-140181cd55a5-kube-api-access-286m5\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.510006 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8fd6ea18-7472-42de-b949-140181cd55a5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.510025 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8fd6ea18-7472-42de-b949-140181cd55a5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.510069 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.510104 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8fd6ea18-7472-42de-b949-140181cd55a5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.510123 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8fd6ea18-7472-42de-b949-140181cd55a5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.510910 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8fd6ea18-7472-42de-b949-140181cd55a5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.511336 4891 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.511428 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8fd6ea18-7472-42de-b949-140181cd55a5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.512256 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/8fd6ea18-7472-42de-b949-140181cd55a5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.513120 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8fd6ea18-7472-42de-b949-140181cd55a5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.514031 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8fd6ea18-7472-42de-b949-140181cd55a5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.524228 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8fd6ea18-7472-42de-b949-140181cd55a5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.526082 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8fd6ea18-7472-42de-b949-140181cd55a5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.529324 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8fd6ea18-7472-42de-b949-140181cd55a5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.537120 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-286m5\" (UniqueName: \"kubernetes.io/projected/8fd6ea18-7472-42de-b949-140181cd55a5-kube-api-access-286m5\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.542261 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.542573 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8fd6ea18-7472-42de-b949-140181cd55a5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.596102 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.751513 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-mtbzs" event={"ID":"5435582b-ebc3-42e5-8a85-65eef8fe8263","Type":"ContainerStarted","Data":"ece187dcc289e34fd72d355309f3b66d2808ef7394f7268c7fd0ec0e621f551b"} Sep 29 10:03:29 crc kubenswrapper[4891]: I0929 10:03:29.874717 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 10:03:30 crc kubenswrapper[4891]: I0929 10:03:30.290005 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 10:03:30 crc kubenswrapper[4891]: W0929 10:03:30.305151 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fd6ea18_7472_42de_b949_140181cd55a5.slice/crio-fdcd1106271e9bcbcc843cd1a795129e21db691bfd022d93711685abd5346252 WatchSource:0}: Error finding container fdcd1106271e9bcbcc843cd1a795129e21db691bfd022d93711685abd5346252: Status 404 returned error can't find the container with id fdcd1106271e9bcbcc843cd1a795129e21db691bfd022d93711685abd5346252 Sep 29 10:03:30 crc kubenswrapper[4891]: I0929 10:03:30.762643 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8fd6ea18-7472-42de-b949-140181cd55a5","Type":"ContainerStarted","Data":"fdcd1106271e9bcbcc843cd1a795129e21db691bfd022d93711685abd5346252"} Sep 29 10:03:30 crc kubenswrapper[4891]: I0929 10:03:30.766450 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f895d522-5026-4d72-862e-1a2b1bd5ee3c","Type":"ContainerStarted","Data":"c9920d452ea798d44cfe1486dec0f524944a0e058a65584caa3de2f3fe567fdb"} Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.264824 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Sep 
29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.268444 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.278002 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.280272 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.282674 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.286406 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.287531 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-pjnqb" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.288005 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.292437 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.351256 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/26add200-3f00-406b-8d30-565e1e51fbd3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.351375 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: 
\"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.351449 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/26add200-3f00-406b-8d30-565e1e51fbd3-kolla-config\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.351480 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26add200-3f00-406b-8d30-565e1e51fbd3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.351679 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/26add200-3f00-406b-8d30-565e1e51fbd3-secrets\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.351851 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/26add200-3f00-406b-8d30-565e1e51fbd3-config-data-default\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.351926 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/26add200-3f00-406b-8d30-565e1e51fbd3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " 
pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.352096 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26add200-3f00-406b-8d30-565e1e51fbd3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.352581 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czlk7\" (UniqueName: \"kubernetes.io/projected/26add200-3f00-406b-8d30-565e1e51fbd3-kube-api-access-czlk7\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.454054 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/26add200-3f00-406b-8d30-565e1e51fbd3-secrets\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.454122 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/26add200-3f00-406b-8d30-565e1e51fbd3-config-data-default\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.454157 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/26add200-3f00-406b-8d30-565e1e51fbd3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 
10:03:31.454188 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26add200-3f00-406b-8d30-565e1e51fbd3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.454240 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czlk7\" (UniqueName: \"kubernetes.io/projected/26add200-3f00-406b-8d30-565e1e51fbd3-kube-api-access-czlk7\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.454265 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/26add200-3f00-406b-8d30-565e1e51fbd3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.454298 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.454331 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/26add200-3f00-406b-8d30-565e1e51fbd3-kolla-config\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.454350 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/26add200-3f00-406b-8d30-565e1e51fbd3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.455201 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/26add200-3f00-406b-8d30-565e1e51fbd3-config-data-default\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.455613 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26add200-3f00-406b-8d30-565e1e51fbd3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.455768 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/26add200-3f00-406b-8d30-565e1e51fbd3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.456383 4891 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.457096 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/26add200-3f00-406b-8d30-565e1e51fbd3-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.468025 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26add200-3f00-406b-8d30-565e1e51fbd3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.468471 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/26add200-3f00-406b-8d30-565e1e51fbd3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.479519 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.482496 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czlk7\" (UniqueName: \"kubernetes.io/projected/26add200-3f00-406b-8d30-565e1e51fbd3-kube-api-access-czlk7\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.492843 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/26add200-3f00-406b-8d30-565e1e51fbd3-secrets\") pod \"openstack-galera-0\" (UID: \"26add200-3f00-406b-8d30-565e1e51fbd3\") " pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.605769 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.664412 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.667172 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.669878 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.670218 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.670559 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.671318 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-k8k7n" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.681982 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.759358 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.759414 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96xxj\" (UniqueName: \"kubernetes.io/projected/71a06463-0c24-4e9a-a4e7-4b0143207f46-kube-api-access-96xxj\") pod \"openstack-cell1-galera-0\" (UID: 
\"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.759436 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a06463-0c24-4e9a-a4e7-4b0143207f46-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.759464 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/71a06463-0c24-4e9a-a4e7-4b0143207f46-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.759483 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/71a06463-0c24-4e9a-a4e7-4b0143207f46-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.759505 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/71a06463-0c24-4e9a-a4e7-4b0143207f46-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.759547 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/71a06463-0c24-4e9a-a4e7-4b0143207f46-secrets\") pod \"openstack-cell1-galera-0\" (UID: 
\"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.759582 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71a06463-0c24-4e9a-a4e7-4b0143207f46-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.759606 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/71a06463-0c24-4e9a-a4e7-4b0143207f46-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.863299 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71a06463-0c24-4e9a-a4e7-4b0143207f46-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.863362 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/71a06463-0c24-4e9a-a4e7-4b0143207f46-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.863395 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " 
pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.863430 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96xxj\" (UniqueName: \"kubernetes.io/projected/71a06463-0c24-4e9a-a4e7-4b0143207f46-kube-api-access-96xxj\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.863452 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a06463-0c24-4e9a-a4e7-4b0143207f46-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.863479 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/71a06463-0c24-4e9a-a4e7-4b0143207f46-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.863496 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/71a06463-0c24-4e9a-a4e7-4b0143207f46-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.863516 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/71a06463-0c24-4e9a-a4e7-4b0143207f46-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc 
kubenswrapper[4891]: I0929 10:03:31.863566 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/71a06463-0c24-4e9a-a4e7-4b0143207f46-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.864479 4891 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.865623 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/71a06463-0c24-4e9a-a4e7-4b0143207f46-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.865671 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/71a06463-0c24-4e9a-a4e7-4b0143207f46-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.866073 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71a06463-0c24-4e9a-a4e7-4b0143207f46-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.869932 4891 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a06463-0c24-4e9a-a4e7-4b0143207f46-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.874222 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/71a06463-0c24-4e9a-a4e7-4b0143207f46-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.882735 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/71a06463-0c24-4e9a-a4e7-4b0143207f46-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.885349 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/71a06463-0c24-4e9a-a4e7-4b0143207f46-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.890115 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96xxj\" (UniqueName: \"kubernetes.io/projected/71a06463-0c24-4e9a-a4e7-4b0143207f46-kube-api-access-96xxj\") pod \"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:31 crc kubenswrapper[4891]: I0929 10:03:31.929581 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"71a06463-0c24-4e9a-a4e7-4b0143207f46\") " pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.058185 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.105187 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.170875 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.171995 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.176571 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-h7jn4" Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.176819 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.176828 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.190123 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.278175 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ec260f8-616d-4e46-8685-0dcabdf10a16-memcached-tls-certs\") pod \"memcached-0\" (UID: \"9ec260f8-616d-4e46-8685-0dcabdf10a16\") " pod="openstack/memcached-0" Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.278232 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lrpw\" 
(UniqueName: \"kubernetes.io/projected/9ec260f8-616d-4e46-8685-0dcabdf10a16-kube-api-access-2lrpw\") pod \"memcached-0\" (UID: \"9ec260f8-616d-4e46-8685-0dcabdf10a16\") " pod="openstack/memcached-0" Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.278708 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ec260f8-616d-4e46-8685-0dcabdf10a16-config-data\") pod \"memcached-0\" (UID: \"9ec260f8-616d-4e46-8685-0dcabdf10a16\") " pod="openstack/memcached-0" Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.278835 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ec260f8-616d-4e46-8685-0dcabdf10a16-combined-ca-bundle\") pod \"memcached-0\" (UID: \"9ec260f8-616d-4e46-8685-0dcabdf10a16\") " pod="openstack/memcached-0" Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.279114 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9ec260f8-616d-4e46-8685-0dcabdf10a16-kolla-config\") pod \"memcached-0\" (UID: \"9ec260f8-616d-4e46-8685-0dcabdf10a16\") " pod="openstack/memcached-0" Sep 29 10:03:32 crc kubenswrapper[4891]: W0929 10:03:32.298195 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26add200_3f00_406b_8d30_565e1e51fbd3.slice/crio-94ad0c7bb91b36ea450d48602b6af7267cbfeb31cbd08ea559507fbfba1ebec2 WatchSource:0}: Error finding container 94ad0c7bb91b36ea450d48602b6af7267cbfeb31cbd08ea559507fbfba1ebec2: Status 404 returned error can't find the container with id 94ad0c7bb91b36ea450d48602b6af7267cbfeb31cbd08ea559507fbfba1ebec2 Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.380367 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9ec260f8-616d-4e46-8685-0dcabdf10a16-kolla-config\") pod \"memcached-0\" (UID: \"9ec260f8-616d-4e46-8685-0dcabdf10a16\") " pod="openstack/memcached-0" Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.380446 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ec260f8-616d-4e46-8685-0dcabdf10a16-memcached-tls-certs\") pod \"memcached-0\" (UID: \"9ec260f8-616d-4e46-8685-0dcabdf10a16\") " pod="openstack/memcached-0" Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.380476 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lrpw\" (UniqueName: \"kubernetes.io/projected/9ec260f8-616d-4e46-8685-0dcabdf10a16-kube-api-access-2lrpw\") pod \"memcached-0\" (UID: \"9ec260f8-616d-4e46-8685-0dcabdf10a16\") " pod="openstack/memcached-0" Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.380529 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ec260f8-616d-4e46-8685-0dcabdf10a16-config-data\") pod \"memcached-0\" (UID: \"9ec260f8-616d-4e46-8685-0dcabdf10a16\") " pod="openstack/memcached-0" Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.380546 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ec260f8-616d-4e46-8685-0dcabdf10a16-combined-ca-bundle\") pod \"memcached-0\" (UID: \"9ec260f8-616d-4e46-8685-0dcabdf10a16\") " pod="openstack/memcached-0" Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.383561 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9ec260f8-616d-4e46-8685-0dcabdf10a16-kolla-config\") pod \"memcached-0\" (UID: \"9ec260f8-616d-4e46-8685-0dcabdf10a16\") " pod="openstack/memcached-0" 
Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.383770 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ec260f8-616d-4e46-8685-0dcabdf10a16-config-data\") pod \"memcached-0\" (UID: \"9ec260f8-616d-4e46-8685-0dcabdf10a16\") " pod="openstack/memcached-0" Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.389071 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ec260f8-616d-4e46-8685-0dcabdf10a16-memcached-tls-certs\") pod \"memcached-0\" (UID: \"9ec260f8-616d-4e46-8685-0dcabdf10a16\") " pod="openstack/memcached-0" Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.390557 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ec260f8-616d-4e46-8685-0dcabdf10a16-combined-ca-bundle\") pod \"memcached-0\" (UID: \"9ec260f8-616d-4e46-8685-0dcabdf10a16\") " pod="openstack/memcached-0" Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.406305 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lrpw\" (UniqueName: \"kubernetes.io/projected/9ec260f8-616d-4e46-8685-0dcabdf10a16-kube-api-access-2lrpw\") pod \"memcached-0\" (UID: \"9ec260f8-616d-4e46-8685-0dcabdf10a16\") " pod="openstack/memcached-0" Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.510753 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.786853 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"26add200-3f00-406b-8d30-565e1e51fbd3","Type":"ContainerStarted","Data":"94ad0c7bb91b36ea450d48602b6af7267cbfeb31cbd08ea559507fbfba1ebec2"} Sep 29 10:03:32 crc kubenswrapper[4891]: I0929 10:03:32.845831 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 29 10:03:32 crc kubenswrapper[4891]: W0929 10:03:32.850976 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71a06463_0c24_4e9a_a4e7_4b0143207f46.slice/crio-b43c54f0173344fbd71b6c28ef581fb6f10a58c8eaab0a5d3e8632b02b354b42 WatchSource:0}: Error finding container b43c54f0173344fbd71b6c28ef581fb6f10a58c8eaab0a5d3e8632b02b354b42: Status 404 returned error can't find the container with id b43c54f0173344fbd71b6c28ef581fb6f10a58c8eaab0a5d3e8632b02b354b42 Sep 29 10:03:33 crc kubenswrapper[4891]: I0929 10:03:33.182854 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 29 10:03:33 crc kubenswrapper[4891]: W0929 10:03:33.210889 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ec260f8_616d_4e46_8685_0dcabdf10a16.slice/crio-f59d6d62b23eead78bc3cb83e8d729e2b8b0cf1d5a4ef353a17965c95e0ab01c WatchSource:0}: Error finding container f59d6d62b23eead78bc3cb83e8d729e2b8b0cf1d5a4ef353a17965c95e0ab01c: Status 404 returned error can't find the container with id f59d6d62b23eead78bc3cb83e8d729e2b8b0cf1d5a4ef353a17965c95e0ab01c Sep 29 10:03:33 crc kubenswrapper[4891]: I0929 10:03:33.822731 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"9ec260f8-616d-4e46-8685-0dcabdf10a16","Type":"ContainerStarted","Data":"f59d6d62b23eead78bc3cb83e8d729e2b8b0cf1d5a4ef353a17965c95e0ab01c"} Sep 29 10:03:33 crc kubenswrapper[4891]: I0929 10:03:33.833234 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"71a06463-0c24-4e9a-a4e7-4b0143207f46","Type":"ContainerStarted","Data":"b43c54f0173344fbd71b6c28ef581fb6f10a58c8eaab0a5d3e8632b02b354b42"} Sep 29 10:03:33 crc kubenswrapper[4891]: I0929 10:03:33.970215 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 10:03:33 crc kubenswrapper[4891]: I0929 10:03:33.971519 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 29 10:03:33 crc kubenswrapper[4891]: I0929 10:03:33.975877 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 10:03:33 crc kubenswrapper[4891]: I0929 10:03:33.978422 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-rrs5s" Sep 29 10:03:34 crc kubenswrapper[4891]: I0929 10:03:34.056253 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg69z\" (UniqueName: \"kubernetes.io/projected/d268201e-fc68-403d-958c-2b402143c96e-kube-api-access-fg69z\") pod \"kube-state-metrics-0\" (UID: \"d268201e-fc68-403d-958c-2b402143c96e\") " pod="openstack/kube-state-metrics-0" Sep 29 10:03:34 crc kubenswrapper[4891]: I0929 10:03:34.158553 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg69z\" (UniqueName: \"kubernetes.io/projected/d268201e-fc68-403d-958c-2b402143c96e-kube-api-access-fg69z\") pod \"kube-state-metrics-0\" (UID: \"d268201e-fc68-403d-958c-2b402143c96e\") " pod="openstack/kube-state-metrics-0" Sep 29 10:03:34 crc kubenswrapper[4891]: I0929 10:03:34.181428 4891 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg69z\" (UniqueName: \"kubernetes.io/projected/d268201e-fc68-403d-958c-2b402143c96e-kube-api-access-fg69z\") pod \"kube-state-metrics-0\" (UID: \"d268201e-fc68-403d-958c-2b402143c96e\") " pod="openstack/kube-state-metrics-0" Sep 29 10:03:34 crc kubenswrapper[4891]: I0929 10:03:34.366784 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 29 10:03:35 crc kubenswrapper[4891]: I0929 10:03:35.002030 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 10:03:36 crc kubenswrapper[4891]: I0929 10:03:36.185825 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:03:36 crc kubenswrapper[4891]: I0929 10:03:36.185891 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:03:36 crc kubenswrapper[4891]: I0929 10:03:36.985497 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-jq4xk"] Sep 29 10:03:36 crc kubenswrapper[4891]: I0929 10:03:36.987198 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jq4xk" Sep 29 10:03:36 crc kubenswrapper[4891]: I0929 10:03:36.991333 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-ccwlj" Sep 29 10:03:36 crc kubenswrapper[4891]: I0929 10:03:36.991780 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Sep 29 10:03:36 crc kubenswrapper[4891]: I0929 10:03:36.992312 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.005437 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-vxc7p"] Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.008125 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-vxc7p" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.013925 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jq4xk"] Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.018653 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7484acb7-f4b2-417b-a478-86b8c5999c34-var-log-ovn\") pod \"ovn-controller-jq4xk\" (UID: \"7484acb7-f4b2-417b-a478-86b8c5999c34\") " pod="openstack/ovn-controller-jq4xk" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.018726 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7484acb7-f4b2-417b-a478-86b8c5999c34-combined-ca-bundle\") pod \"ovn-controller-jq4xk\" (UID: \"7484acb7-f4b2-417b-a478-86b8c5999c34\") " pod="openstack/ovn-controller-jq4xk" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.018775 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7484acb7-f4b2-417b-a478-86b8c5999c34-scripts\") pod \"ovn-controller-jq4xk\" (UID: \"7484acb7-f4b2-417b-a478-86b8c5999c34\") " pod="openstack/ovn-controller-jq4xk" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.018927 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7484acb7-f4b2-417b-a478-86b8c5999c34-var-run\") pod \"ovn-controller-jq4xk\" (UID: \"7484acb7-f4b2-417b-a478-86b8c5999c34\") " pod="openstack/ovn-controller-jq4xk" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.018961 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7484acb7-f4b2-417b-a478-86b8c5999c34-var-run-ovn\") pod \"ovn-controller-jq4xk\" (UID: \"7484acb7-f4b2-417b-a478-86b8c5999c34\") " pod="openstack/ovn-controller-jq4xk" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.018997 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9fk9\" (UniqueName: \"kubernetes.io/projected/7484acb7-f4b2-417b-a478-86b8c5999c34-kube-api-access-q9fk9\") pod \"ovn-controller-jq4xk\" (UID: \"7484acb7-f4b2-417b-a478-86b8c5999c34\") " pod="openstack/ovn-controller-jq4xk" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.019032 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7484acb7-f4b2-417b-a478-86b8c5999c34-ovn-controller-tls-certs\") pod \"ovn-controller-jq4xk\" (UID: \"7484acb7-f4b2-417b-a478-86b8c5999c34\") " pod="openstack/ovn-controller-jq4xk" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.020162 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-ovs-vxc7p"] Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.148781 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9fk9\" (UniqueName: \"kubernetes.io/projected/7484acb7-f4b2-417b-a478-86b8c5999c34-kube-api-access-q9fk9\") pod \"ovn-controller-jq4xk\" (UID: \"7484acb7-f4b2-417b-a478-86b8c5999c34\") " pod="openstack/ovn-controller-jq4xk" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.148881 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7484acb7-f4b2-417b-a478-86b8c5999c34-ovn-controller-tls-certs\") pod \"ovn-controller-jq4xk\" (UID: \"7484acb7-f4b2-417b-a478-86b8c5999c34\") " pod="openstack/ovn-controller-jq4xk" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.148977 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7484acb7-f4b2-417b-a478-86b8c5999c34-var-log-ovn\") pod \"ovn-controller-jq4xk\" (UID: \"7484acb7-f4b2-417b-a478-86b8c5999c34\") " pod="openstack/ovn-controller-jq4xk" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.148999 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7484acb7-f4b2-417b-a478-86b8c5999c34-combined-ca-bundle\") pod \"ovn-controller-jq4xk\" (UID: \"7484acb7-f4b2-417b-a478-86b8c5999c34\") " pod="openstack/ovn-controller-jq4xk" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.149022 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7484acb7-f4b2-417b-a478-86b8c5999c34-scripts\") pod \"ovn-controller-jq4xk\" (UID: \"7484acb7-f4b2-417b-a478-86b8c5999c34\") " pod="openstack/ovn-controller-jq4xk" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.149054 4891 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7484acb7-f4b2-417b-a478-86b8c5999c34-var-run\") pod \"ovn-controller-jq4xk\" (UID: \"7484acb7-f4b2-417b-a478-86b8c5999c34\") " pod="openstack/ovn-controller-jq4xk" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.149090 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7484acb7-f4b2-417b-a478-86b8c5999c34-var-run-ovn\") pod \"ovn-controller-jq4xk\" (UID: \"7484acb7-f4b2-417b-a478-86b8c5999c34\") " pod="openstack/ovn-controller-jq4xk" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.149731 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7484acb7-f4b2-417b-a478-86b8c5999c34-var-run-ovn\") pod \"ovn-controller-jq4xk\" (UID: \"7484acb7-f4b2-417b-a478-86b8c5999c34\") " pod="openstack/ovn-controller-jq4xk" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.150192 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7484acb7-f4b2-417b-a478-86b8c5999c34-var-log-ovn\") pod \"ovn-controller-jq4xk\" (UID: \"7484acb7-f4b2-417b-a478-86b8c5999c34\") " pod="openstack/ovn-controller-jq4xk" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.151042 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7484acb7-f4b2-417b-a478-86b8c5999c34-var-run\") pod \"ovn-controller-jq4xk\" (UID: \"7484acb7-f4b2-417b-a478-86b8c5999c34\") " pod="openstack/ovn-controller-jq4xk" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.159701 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7484acb7-f4b2-417b-a478-86b8c5999c34-ovn-controller-tls-certs\") pod 
\"ovn-controller-jq4xk\" (UID: \"7484acb7-f4b2-417b-a478-86b8c5999c34\") " pod="openstack/ovn-controller-jq4xk" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.173816 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7484acb7-f4b2-417b-a478-86b8c5999c34-combined-ca-bundle\") pod \"ovn-controller-jq4xk\" (UID: \"7484acb7-f4b2-417b-a478-86b8c5999c34\") " pod="openstack/ovn-controller-jq4xk" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.175415 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7484acb7-f4b2-417b-a478-86b8c5999c34-scripts\") pod \"ovn-controller-jq4xk\" (UID: \"7484acb7-f4b2-417b-a478-86b8c5999c34\") " pod="openstack/ovn-controller-jq4xk" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.180846 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9fk9\" (UniqueName: \"kubernetes.io/projected/7484acb7-f4b2-417b-a478-86b8c5999c34-kube-api-access-q9fk9\") pod \"ovn-controller-jq4xk\" (UID: \"7484acb7-f4b2-417b-a478-86b8c5999c34\") " pod="openstack/ovn-controller-jq4xk" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.220315 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jq4xk" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.250918 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1fc0a48-e2b3-479b-948c-ff2279a7205c-scripts\") pod \"ovn-controller-ovs-vxc7p\" (UID: \"c1fc0a48-e2b3-479b-948c-ff2279a7205c\") " pod="openstack/ovn-controller-ovs-vxc7p" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.250977 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c1fc0a48-e2b3-479b-948c-ff2279a7205c-var-lib\") pod \"ovn-controller-ovs-vxc7p\" (UID: \"c1fc0a48-e2b3-479b-948c-ff2279a7205c\") " pod="openstack/ovn-controller-ovs-vxc7p" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.251006 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c1fc0a48-e2b3-479b-948c-ff2279a7205c-var-log\") pod \"ovn-controller-ovs-vxc7p\" (UID: \"c1fc0a48-e2b3-479b-948c-ff2279a7205c\") " pod="openstack/ovn-controller-ovs-vxc7p" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.251503 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k27f6\" (UniqueName: \"kubernetes.io/projected/c1fc0a48-e2b3-479b-948c-ff2279a7205c-kube-api-access-k27f6\") pod \"ovn-controller-ovs-vxc7p\" (UID: \"c1fc0a48-e2b3-479b-948c-ff2279a7205c\") " pod="openstack/ovn-controller-ovs-vxc7p" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.251547 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c1fc0a48-e2b3-479b-948c-ff2279a7205c-etc-ovs\") pod \"ovn-controller-ovs-vxc7p\" (UID: \"c1fc0a48-e2b3-479b-948c-ff2279a7205c\") " 
pod="openstack/ovn-controller-ovs-vxc7p" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.251946 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c1fc0a48-e2b3-479b-948c-ff2279a7205c-var-run\") pod \"ovn-controller-ovs-vxc7p\" (UID: \"c1fc0a48-e2b3-479b-948c-ff2279a7205c\") " pod="openstack/ovn-controller-ovs-vxc7p" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.354402 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1fc0a48-e2b3-479b-948c-ff2279a7205c-scripts\") pod \"ovn-controller-ovs-vxc7p\" (UID: \"c1fc0a48-e2b3-479b-948c-ff2279a7205c\") " pod="openstack/ovn-controller-ovs-vxc7p" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.356980 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1fc0a48-e2b3-479b-948c-ff2279a7205c-scripts\") pod \"ovn-controller-ovs-vxc7p\" (UID: \"c1fc0a48-e2b3-479b-948c-ff2279a7205c\") " pod="openstack/ovn-controller-ovs-vxc7p" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.357058 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c1fc0a48-e2b3-479b-948c-ff2279a7205c-var-lib\") pod \"ovn-controller-ovs-vxc7p\" (UID: \"c1fc0a48-e2b3-479b-948c-ff2279a7205c\") " pod="openstack/ovn-controller-ovs-vxc7p" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.357351 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c1fc0a48-e2b3-479b-948c-ff2279a7205c-var-lib\") pod \"ovn-controller-ovs-vxc7p\" (UID: \"c1fc0a48-e2b3-479b-948c-ff2279a7205c\") " pod="openstack/ovn-controller-ovs-vxc7p" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.357410 4891 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c1fc0a48-e2b3-479b-948c-ff2279a7205c-var-log\") pod \"ovn-controller-ovs-vxc7p\" (UID: \"c1fc0a48-e2b3-479b-948c-ff2279a7205c\") " pod="openstack/ovn-controller-ovs-vxc7p" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.357518 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c1fc0a48-e2b3-479b-948c-ff2279a7205c-var-log\") pod \"ovn-controller-ovs-vxc7p\" (UID: \"c1fc0a48-e2b3-479b-948c-ff2279a7205c\") " pod="openstack/ovn-controller-ovs-vxc7p" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.357933 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k27f6\" (UniqueName: \"kubernetes.io/projected/c1fc0a48-e2b3-479b-948c-ff2279a7205c-kube-api-access-k27f6\") pod \"ovn-controller-ovs-vxc7p\" (UID: \"c1fc0a48-e2b3-479b-948c-ff2279a7205c\") " pod="openstack/ovn-controller-ovs-vxc7p" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.358002 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c1fc0a48-e2b3-479b-948c-ff2279a7205c-etc-ovs\") pod \"ovn-controller-ovs-vxc7p\" (UID: \"c1fc0a48-e2b3-479b-948c-ff2279a7205c\") " pod="openstack/ovn-controller-ovs-vxc7p" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.358175 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c1fc0a48-e2b3-479b-948c-ff2279a7205c-etc-ovs\") pod \"ovn-controller-ovs-vxc7p\" (UID: \"c1fc0a48-e2b3-479b-948c-ff2279a7205c\") " pod="openstack/ovn-controller-ovs-vxc7p" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.358484 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c1fc0a48-e2b3-479b-948c-ff2279a7205c-var-run\") pod 
\"ovn-controller-ovs-vxc7p\" (UID: \"c1fc0a48-e2b3-479b-948c-ff2279a7205c\") " pod="openstack/ovn-controller-ovs-vxc7p" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.358576 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c1fc0a48-e2b3-479b-948c-ff2279a7205c-var-run\") pod \"ovn-controller-ovs-vxc7p\" (UID: \"c1fc0a48-e2b3-479b-948c-ff2279a7205c\") " pod="openstack/ovn-controller-ovs-vxc7p" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.397618 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k27f6\" (UniqueName: \"kubernetes.io/projected/c1fc0a48-e2b3-479b-948c-ff2279a7205c-kube-api-access-k27f6\") pod \"ovn-controller-ovs-vxc7p\" (UID: \"c1fc0a48-e2b3-479b-948c-ff2279a7205c\") " pod="openstack/ovn-controller-ovs-vxc7p" Sep 29 10:03:37 crc kubenswrapper[4891]: I0929 10:03:37.531325 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-vxc7p" Sep 29 10:03:38 crc kubenswrapper[4891]: W0929 10:03:38.701003 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd268201e_fc68_403d_958c_2b402143c96e.slice/crio-6883d682bfb090e2f87bcfc7dc96bc5715d0aff173ffb529590ffadb24083203 WatchSource:0}: Error finding container 6883d682bfb090e2f87bcfc7dc96bc5715d0aff173ffb529590ffadb24083203: Status 404 returned error can't find the container with id 6883d682bfb090e2f87bcfc7dc96bc5715d0aff173ffb529590ffadb24083203 Sep 29 10:03:38 crc kubenswrapper[4891]: I0929 10:03:38.885019 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d268201e-fc68-403d-958c-2b402143c96e","Type":"ContainerStarted","Data":"6883d682bfb090e2f87bcfc7dc96bc5715d0aff173ffb529590ffadb24083203"} Sep 29 10:03:39 crc kubenswrapper[4891]: I0929 10:03:39.737013 4891 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 29 10:03:39 crc kubenswrapper[4891]: I0929 10:03:39.739557 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:39 crc kubenswrapper[4891]: I0929 10:03:39.742538 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Sep 29 10:03:39 crc kubenswrapper[4891]: I0929 10:03:39.742698 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-bxwz8" Sep 29 10:03:39 crc kubenswrapper[4891]: I0929 10:03:39.742713 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Sep 29 10:03:39 crc kubenswrapper[4891]: I0929 10:03:39.742724 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Sep 29 10:03:39 crc kubenswrapper[4891]: I0929 10:03:39.745904 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Sep 29 10:03:39 crc kubenswrapper[4891]: I0929 10:03:39.771215 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 29 10:03:39 crc kubenswrapper[4891]: I0929 10:03:39.901193 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/807cd996-3d20-4f16-b5bb-3b4e4da82775-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") " pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:39 crc kubenswrapper[4891]: I0929 10:03:39.901264 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/807cd996-3d20-4f16-b5bb-3b4e4da82775-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") " 
pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:39 crc kubenswrapper[4891]: I0929 10:03:39.901290 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/807cd996-3d20-4f16-b5bb-3b4e4da82775-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") " pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:39 crc kubenswrapper[4891]: I0929 10:03:39.901337 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/807cd996-3d20-4f16-b5bb-3b4e4da82775-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") " pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:39 crc kubenswrapper[4891]: I0929 10:03:39.901361 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") " pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:39 crc kubenswrapper[4891]: I0929 10:03:39.901612 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcmpx\" (UniqueName: \"kubernetes.io/projected/807cd996-3d20-4f16-b5bb-3b4e4da82775-kube-api-access-zcmpx\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") " pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:39 crc kubenswrapper[4891]: I0929 10:03:39.901677 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807cd996-3d20-4f16-b5bb-3b4e4da82775-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") " pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:39 crc kubenswrapper[4891]: 
I0929 10:03:39.901775 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807cd996-3d20-4f16-b5bb-3b4e4da82775-config\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") " pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:40 crc kubenswrapper[4891]: I0929 10:03:40.003922 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807cd996-3d20-4f16-b5bb-3b4e4da82775-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") " pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:40 crc kubenswrapper[4891]: I0929 10:03:40.004016 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807cd996-3d20-4f16-b5bb-3b4e4da82775-config\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") " pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:40 crc kubenswrapper[4891]: I0929 10:03:40.004043 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/807cd996-3d20-4f16-b5bb-3b4e4da82775-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") " pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:40 crc kubenswrapper[4891]: I0929 10:03:40.004087 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/807cd996-3d20-4f16-b5bb-3b4e4da82775-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") " pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:40 crc kubenswrapper[4891]: I0929 10:03:40.004106 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/807cd996-3d20-4f16-b5bb-3b4e4da82775-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") " pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:40 crc kubenswrapper[4891]: I0929 10:03:40.004148 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/807cd996-3d20-4f16-b5bb-3b4e4da82775-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") " pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:40 crc kubenswrapper[4891]: I0929 10:03:40.004180 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") " pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:40 crc kubenswrapper[4891]: I0929 10:03:40.004227 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcmpx\" (UniqueName: \"kubernetes.io/projected/807cd996-3d20-4f16-b5bb-3b4e4da82775-kube-api-access-zcmpx\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") " pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:40 crc kubenswrapper[4891]: I0929 10:03:40.004968 4891 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:40 crc kubenswrapper[4891]: I0929 10:03:40.005024 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/807cd996-3d20-4f16-b5bb-3b4e4da82775-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") " 
pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:40 crc kubenswrapper[4891]: I0929 10:03:40.005329 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807cd996-3d20-4f16-b5bb-3b4e4da82775-config\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") " pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:40 crc kubenswrapper[4891]: I0929 10:03:40.005587 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/807cd996-3d20-4f16-b5bb-3b4e4da82775-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") " pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:40 crc kubenswrapper[4891]: I0929 10:03:40.011925 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/807cd996-3d20-4f16-b5bb-3b4e4da82775-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") " pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:40 crc kubenswrapper[4891]: I0929 10:03:40.012133 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/807cd996-3d20-4f16-b5bb-3b4e4da82775-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") " pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:40 crc kubenswrapper[4891]: I0929 10:03:40.025178 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807cd996-3d20-4f16-b5bb-3b4e4da82775-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") " pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:40 crc kubenswrapper[4891]: I0929 10:03:40.027107 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcmpx\" (UniqueName: 
\"kubernetes.io/projected/807cd996-3d20-4f16-b5bb-3b4e4da82775-kube-api-access-zcmpx\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") " pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:40 crc kubenswrapper[4891]: I0929 10:03:40.034004 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"807cd996-3d20-4f16-b5bb-3b4e4da82775\") " pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:40 crc kubenswrapper[4891]: I0929 10:03:40.089210 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.312843 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.314706 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.323307 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.324466 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-ndjwz" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.324506 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.324708 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.331600 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.430284 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr2mz\" (UniqueName: \"kubernetes.io/projected/c6033a7a-34ac-409d-ab81-035b291364aa-kube-api-access-kr2mz\") pod \"ovsdbserver-sb-0\" (UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") " pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.430547 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c6033a7a-34ac-409d-ab81-035b291364aa-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") " pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.431087 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6033a7a-34ac-409d-ab81-035b291364aa-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") " pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.431298 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6033a7a-34ac-409d-ab81-035b291364aa-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") " pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.431317 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6033a7a-34ac-409d-ab81-035b291364aa-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") " pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.431369 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") " pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.431908 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c6033a7a-34ac-409d-ab81-035b291364aa-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") " pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.432011 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6033a7a-34ac-409d-ab81-035b291364aa-config\") pod \"ovsdbserver-sb-0\" (UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") " pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.534500 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c6033a7a-34ac-409d-ab81-035b291364aa-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") " pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.534584 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6033a7a-34ac-409d-ab81-035b291364aa-config\") pod \"ovsdbserver-sb-0\" (UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") " pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.534657 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr2mz\" (UniqueName: \"kubernetes.io/projected/c6033a7a-34ac-409d-ab81-035b291364aa-kube-api-access-kr2mz\") pod 
\"ovsdbserver-sb-0\" (UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") " pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.534696 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c6033a7a-34ac-409d-ab81-035b291364aa-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") " pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.534730 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6033a7a-34ac-409d-ab81-035b291364aa-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") " pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.534763 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6033a7a-34ac-409d-ab81-035b291364aa-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") " pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.534801 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6033a7a-34ac-409d-ab81-035b291364aa-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") " pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.534878 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") " pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.536636 4891 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c6033a7a-34ac-409d-ab81-035b291364aa-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") " pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.538021 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c6033a7a-34ac-409d-ab81-035b291364aa-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") " pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.535269 4891 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.538603 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6033a7a-34ac-409d-ab81-035b291364aa-config\") pod \"ovsdbserver-sb-0\" (UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") " pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.541611 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6033a7a-34ac-409d-ab81-035b291364aa-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") " pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.541773 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6033a7a-34ac-409d-ab81-035b291364aa-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" 
(UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") " pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.541915 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6033a7a-34ac-409d-ab81-035b291364aa-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") " pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.566774 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") " pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.567165 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr2mz\" (UniqueName: \"kubernetes.io/projected/c6033a7a-34ac-409d-ab81-035b291364aa-kube-api-access-kr2mz\") pod \"ovsdbserver-sb-0\" (UID: \"c6033a7a-34ac-409d-ab81-035b291364aa\") " pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:41 crc kubenswrapper[4891]: I0929 10:03:41.660309 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 29 10:03:47 crc kubenswrapper[4891]: E0929 10:03:47.076965 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Sep 29 10:03:47 crc kubenswrapper[4891]: E0929 10:03:47.078141 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-286m5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(8fd6ea18-7472-42de-b949-140181cd55a5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:03:47 crc 
kubenswrapper[4891]: E0929 10:03:47.080027 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8fd6ea18-7472-42de-b949-140181cd55a5" Sep 29 10:03:47 crc kubenswrapper[4891]: E0929 10:03:47.974384 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8fd6ea18-7472-42de-b949-140181cd55a5" Sep 29 10:03:55 crc kubenswrapper[4891]: E0929 10:03:55.998372 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Sep 29 10:03:56 crc kubenswrapper[4891]: E0929 10:03:55.999518 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:DB_ROOT_PASSWORD,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:DbRootPassword,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:secrets,ReadOnly:true,MountPath:/var/lib/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-96xxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,Ru
nAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(71a06463-0c24-4e9a-a4e7-4b0143207f46): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:03:56 crc kubenswrapper[4891]: E0929 10:03:56.001140 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="71a06463-0c24-4e9a-a4e7-4b0143207f46" Sep 29 10:03:56 crc kubenswrapper[4891]: E0929 10:03:56.013081 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Sep 29 10:03:56 crc kubenswrapper[4891]: E0929 10:03:56.013760 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi 
BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gp8w5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(f895d522-5026-4d72-862e-1a2b1bd5ee3c): ErrImagePull: rpc error: code = Canceled 
desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:03:56 crc kubenswrapper[4891]: E0929 10:03:56.015096 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="f895d522-5026-4d72-862e-1a2b1bd5ee3c" Sep 29 10:03:56 crc kubenswrapper[4891]: E0929 10:03:56.043862 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="f895d522-5026-4d72-862e-1a2b1bd5ee3c" Sep 29 10:03:56 crc kubenswrapper[4891]: E0929 10:03:56.751923 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Sep 29 10:03:56 crc kubenswrapper[4891]: E0929 10:03:56.752198 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hkdjm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-mtbzs_openstack(5435582b-ebc3-42e5-8a85-65eef8fe8263): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:03:56 crc kubenswrapper[4891]: E0929 10:03:56.753434 4891 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-mtbzs" podUID="5435582b-ebc3-42e5-8a85-65eef8fe8263" Sep 29 10:03:56 crc kubenswrapper[4891]: E0929 10:03:56.754571 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Sep 29 10:03:56 crc kubenswrapper[4891]: E0929 10:03:56.754910 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tfpcm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-snz4k_openstack(d9326b29-45ea-4953-ab07-fe1812d0fdde): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:03:56 crc kubenswrapper[4891]: E0929 10:03:56.756179 4891 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-snz4k" podUID="d9326b29-45ea-4953-ab07-fe1812d0fdde" Sep 29 10:03:56 crc kubenswrapper[4891]: E0929 10:03:56.762565 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Sep 29 10:03:56 crc kubenswrapper[4891]: E0929 10:03:56.762844 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mfzkc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-pjm79_openstack(4f192589-5b85-445d-b39f-3e3ef03bd932): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:03:56 crc kubenswrapper[4891]: E0929 10:03:56.764000 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-pjm79" podUID="4f192589-5b85-445d-b39f-3e3ef03bd932" Sep 29 10:03:56 crc kubenswrapper[4891]: E0929 10:03:56.794497 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Sep 29 10:03:56 crc kubenswrapper[4891]: E0929 10:03:56.794759 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zqq9q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-l62kp_openstack(876dc782-d709-4ad3-a7c0-1dd7bd42e358): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:03:56 crc kubenswrapper[4891]: E0929 10:03:56.795949 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-l62kp" podUID="876dc782-d709-4ad3-a7c0-1dd7bd42e358" Sep 29 10:03:57 crc kubenswrapper[4891]: E0929 10:03:57.053465 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-mtbzs" podUID="5435582b-ebc3-42e5-8a85-65eef8fe8263" Sep 29 10:03:57 crc kubenswrapper[4891]: E0929 10:03:57.082688 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-snz4k" podUID="d9326b29-45ea-4953-ab07-fe1812d0fdde" Sep 29 10:03:57 crc kubenswrapper[4891]: I0929 10:03:57.240164 4891 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jq4xk"] Sep 29 10:03:57 crc kubenswrapper[4891]: I0929 10:03:57.356662 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 29 10:03:57 crc kubenswrapper[4891]: I0929 10:03:57.459550 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-vxc7p"] Sep 29 10:03:57 crc kubenswrapper[4891]: W0929 10:03:57.549885 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1fc0a48_e2b3_479b_948c_ff2279a7205c.slice/crio-57be7ecc9d57c937c92c2e3852cffe52f2c691cd9c01dc76c92849078f9d5c8c WatchSource:0}: Error finding container 57be7ecc9d57c937c92c2e3852cffe52f2c691cd9c01dc76c92849078f9d5c8c: Status 404 returned error can't find the container with id 57be7ecc9d57c937c92c2e3852cffe52f2c691cd9c01dc76c92849078f9d5c8c Sep 29 10:03:57 crc kubenswrapper[4891]: I0929 10:03:57.589661 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 29 10:03:57 crc kubenswrapper[4891]: I0929 10:03:57.770711 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-l62kp" Sep 29 10:03:57 crc kubenswrapper[4891]: W0929 10:03:57.825106 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod807cd996_3d20_4f16_b5bb_3b4e4da82775.slice/crio-000af5f3b98161ae7a4d1a8f976693eed81988f204e3df1b939d7e1c1e41fc5e WatchSource:0}: Error finding container 000af5f3b98161ae7a4d1a8f976693eed81988f204e3df1b939d7e1c1e41fc5e: Status 404 returned error can't find the container with id 000af5f3b98161ae7a4d1a8f976693eed81988f204e3df1b939d7e1c1e41fc5e Sep 29 10:03:57 crc kubenswrapper[4891]: I0929 10:03:57.904880 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-pjm79" Sep 29 10:03:57 crc kubenswrapper[4891]: I0929 10:03:57.937254 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/876dc782-d709-4ad3-a7c0-1dd7bd42e358-config\") pod \"876dc782-d709-4ad3-a7c0-1dd7bd42e358\" (UID: \"876dc782-d709-4ad3-a7c0-1dd7bd42e358\") " Sep 29 10:03:57 crc kubenswrapper[4891]: I0929 10:03:57.937376 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqq9q\" (UniqueName: \"kubernetes.io/projected/876dc782-d709-4ad3-a7c0-1dd7bd42e358-kube-api-access-zqq9q\") pod \"876dc782-d709-4ad3-a7c0-1dd7bd42e358\" (UID: \"876dc782-d709-4ad3-a7c0-1dd7bd42e358\") " Sep 29 10:03:57 crc kubenswrapper[4891]: I0929 10:03:57.937402 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/876dc782-d709-4ad3-a7c0-1dd7bd42e358-dns-svc\") pod \"876dc782-d709-4ad3-a7c0-1dd7bd42e358\" (UID: \"876dc782-d709-4ad3-a7c0-1dd7bd42e358\") " Sep 29 10:03:57 crc kubenswrapper[4891]: I0929 10:03:57.938409 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/876dc782-d709-4ad3-a7c0-1dd7bd42e358-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "876dc782-d709-4ad3-a7c0-1dd7bd42e358" (UID: "876dc782-d709-4ad3-a7c0-1dd7bd42e358"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:03:57 crc kubenswrapper[4891]: I0929 10:03:57.939336 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/876dc782-d709-4ad3-a7c0-1dd7bd42e358-config" (OuterVolumeSpecName: "config") pod "876dc782-d709-4ad3-a7c0-1dd7bd42e358" (UID: "876dc782-d709-4ad3-a7c0-1dd7bd42e358"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:03:57 crc kubenswrapper[4891]: I0929 10:03:57.948611 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/876dc782-d709-4ad3-a7c0-1dd7bd42e358-kube-api-access-zqq9q" (OuterVolumeSpecName: "kube-api-access-zqq9q") pod "876dc782-d709-4ad3-a7c0-1dd7bd42e358" (UID: "876dc782-d709-4ad3-a7c0-1dd7bd42e358"). InnerVolumeSpecName "kube-api-access-zqq9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.039554 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f192589-5b85-445d-b39f-3e3ef03bd932-config\") pod \"4f192589-5b85-445d-b39f-3e3ef03bd932\" (UID: \"4f192589-5b85-445d-b39f-3e3ef03bd932\") " Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.040071 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f192589-5b85-445d-b39f-3e3ef03bd932-config" (OuterVolumeSpecName: "config") pod "4f192589-5b85-445d-b39f-3e3ef03bd932" (UID: "4f192589-5b85-445d-b39f-3e3ef03bd932"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.040142 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfzkc\" (UniqueName: \"kubernetes.io/projected/4f192589-5b85-445d-b39f-3e3ef03bd932-kube-api-access-mfzkc\") pod \"4f192589-5b85-445d-b39f-3e3ef03bd932\" (UID: \"4f192589-5b85-445d-b39f-3e3ef03bd932\") " Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.043496 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f192589-5b85-445d-b39f-3e3ef03bd932-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.043549 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/876dc782-d709-4ad3-a7c0-1dd7bd42e358-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.043569 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqq9q\" (UniqueName: \"kubernetes.io/projected/876dc782-d709-4ad3-a7c0-1dd7bd42e358-kube-api-access-zqq9q\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.043585 4891 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/876dc782-d709-4ad3-a7c0-1dd7bd42e358-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.046268 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f192589-5b85-445d-b39f-3e3ef03bd932-kube-api-access-mfzkc" (OuterVolumeSpecName: "kube-api-access-mfzkc") pod "4f192589-5b85-445d-b39f-3e3ef03bd932" (UID: "4f192589-5b85-445d-b39f-3e3ef03bd932"). InnerVolumeSpecName "kube-api-access-mfzkc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.081875 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jq4xk" event={"ID":"7484acb7-f4b2-417b-a478-86b8c5999c34","Type":"ContainerStarted","Data":"f9523e2b7b3dfed9f734555e1ef103b870bd4179e6b24476ebc396a32ef71f6b"} Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.085599 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"9ec260f8-616d-4e46-8685-0dcabdf10a16","Type":"ContainerStarted","Data":"61a6fd3991eb6d108c4d9f0130a30e21cb28f543d065dadfe68da77a65a778bc"} Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.085667 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.086919 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-l62kp" event={"ID":"876dc782-d709-4ad3-a7c0-1dd7bd42e358","Type":"ContainerDied","Data":"02769ad68b7d5c2c24c3cd24195e74f9b0848bcfafeaffe7f640a3ca174885bd"} Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.087032 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-l62kp" Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.090100 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vxc7p" event={"ID":"c1fc0a48-e2b3-479b-948c-ff2279a7205c","Type":"ContainerStarted","Data":"57be7ecc9d57c937c92c2e3852cffe52f2c691cd9c01dc76c92849078f9d5c8c"} Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.094345 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"71a06463-0c24-4e9a-a4e7-4b0143207f46","Type":"ContainerStarted","Data":"31e4152ce3c5caff32e1310bd85cd2881c1c7b1d585f5618e270453b31b023fe"} Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.098027 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"26add200-3f00-406b-8d30-565e1e51fbd3","Type":"ContainerStarted","Data":"cd7aca78aea76671fe6a06caf40053c3e852cd4490584e327391c7b48e56f3c8"} Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.099108 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c6033a7a-34ac-409d-ab81-035b291364aa","Type":"ContainerStarted","Data":"96779e3603d2fc396bfa02065950ebcf50253f7234f5de78433b9e9a48ec7862"} Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.100045 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"807cd996-3d20-4f16-b5bb-3b4e4da82775","Type":"ContainerStarted","Data":"000af5f3b98161ae7a4d1a8f976693eed81988f204e3df1b939d7e1c1e41fc5e"} Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.104346 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-pjm79" Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.104325 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-pjm79" event={"ID":"4f192589-5b85-445d-b39f-3e3ef03bd932","Type":"ContainerDied","Data":"33615884b21add34046a94d0eda403defd858f456d2fd0f3d4c3babcb04fef9d"} Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.120167 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.629280149 podStartE2EDuration="26.120142678s" podCreationTimestamp="2025-09-29 10:03:32 +0000 UTC" firstStartedPulling="2025-09-29 10:03:33.220985692 +0000 UTC m=+943.426154013" lastFinishedPulling="2025-09-29 10:03:56.711848211 +0000 UTC m=+966.917016542" observedRunningTime="2025-09-29 10:03:58.113751171 +0000 UTC m=+968.318919482" watchObservedRunningTime="2025-09-29 10:03:58.120142678 +0000 UTC m=+968.325310989" Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.145371 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfzkc\" (UniqueName: \"kubernetes.io/projected/4f192589-5b85-445d-b39f-3e3ef03bd932-kube-api-access-mfzkc\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.180063 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-l62kp"] Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.186555 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-l62kp"] Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.239393 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pjm79"] Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.245189 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pjm79"] Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.412838 4891 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f192589-5b85-445d-b39f-3e3ef03bd932" path="/var/lib/kubelet/pods/4f192589-5b85-445d-b39f-3e3ef03bd932/volumes" Sep 29 10:03:58 crc kubenswrapper[4891]: I0929 10:03:58.413345 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="876dc782-d709-4ad3-a7c0-1dd7bd42e358" path="/var/lib/kubelet/pods/876dc782-d709-4ad3-a7c0-1dd7bd42e358/volumes" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.122188 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d268201e-fc68-403d-958c-2b402143c96e","Type":"ContainerStarted","Data":"65060e9ccddfd75e6fc9883ecdabb867faded12a77979acb5788676ed42261a1"} Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.122611 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.148821 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=6.916224393 podStartE2EDuration="27.148777503s" podCreationTimestamp="2025-09-29 10:03:33 +0000 UTC" firstStartedPulling="2025-09-29 10:03:38.714826651 +0000 UTC m=+948.919994982" lastFinishedPulling="2025-09-29 10:03:58.947379771 +0000 UTC m=+969.152548092" observedRunningTime="2025-09-29 10:04:00.137645556 +0000 UTC m=+970.342813877" watchObservedRunningTime="2025-09-29 10:04:00.148777503 +0000 UTC m=+970.353945824" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.371973 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-czcgf"] Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.373519 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-czcgf" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.376478 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.386881 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-czcgf"] Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.491057 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeab10b4-2f08-4eed-88eb-ba6f26db6cd0-combined-ca-bundle\") pod \"ovn-controller-metrics-czcgf\" (UID: \"aeab10b4-2f08-4eed-88eb-ba6f26db6cd0\") " pod="openstack/ovn-controller-metrics-czcgf" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.491113 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/aeab10b4-2f08-4eed-88eb-ba6f26db6cd0-ovs-rundir\") pod \"ovn-controller-metrics-czcgf\" (UID: \"aeab10b4-2f08-4eed-88eb-ba6f26db6cd0\") " pod="openstack/ovn-controller-metrics-czcgf" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.491163 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx4jh\" (UniqueName: \"kubernetes.io/projected/aeab10b4-2f08-4eed-88eb-ba6f26db6cd0-kube-api-access-gx4jh\") pod \"ovn-controller-metrics-czcgf\" (UID: \"aeab10b4-2f08-4eed-88eb-ba6f26db6cd0\") " pod="openstack/ovn-controller-metrics-czcgf" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.491208 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/aeab10b4-2f08-4eed-88eb-ba6f26db6cd0-ovn-rundir\") pod \"ovn-controller-metrics-czcgf\" (UID: \"aeab10b4-2f08-4eed-88eb-ba6f26db6cd0\") " 
pod="openstack/ovn-controller-metrics-czcgf" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.491243 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeab10b4-2f08-4eed-88eb-ba6f26db6cd0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-czcgf\" (UID: \"aeab10b4-2f08-4eed-88eb-ba6f26db6cd0\") " pod="openstack/ovn-controller-metrics-czcgf" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.491264 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeab10b4-2f08-4eed-88eb-ba6f26db6cd0-config\") pod \"ovn-controller-metrics-czcgf\" (UID: \"aeab10b4-2f08-4eed-88eb-ba6f26db6cd0\") " pod="openstack/ovn-controller-metrics-czcgf" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.569652 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mtbzs"] Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.599365 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeab10b4-2f08-4eed-88eb-ba6f26db6cd0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-czcgf\" (UID: \"aeab10b4-2f08-4eed-88eb-ba6f26db6cd0\") " pod="openstack/ovn-controller-metrics-czcgf" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.599874 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeab10b4-2f08-4eed-88eb-ba6f26db6cd0-config\") pod \"ovn-controller-metrics-czcgf\" (UID: \"aeab10b4-2f08-4eed-88eb-ba6f26db6cd0\") " pod="openstack/ovn-controller-metrics-czcgf" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.600158 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aeab10b4-2f08-4eed-88eb-ba6f26db6cd0-combined-ca-bundle\") pod \"ovn-controller-metrics-czcgf\" (UID: \"aeab10b4-2f08-4eed-88eb-ba6f26db6cd0\") " pod="openstack/ovn-controller-metrics-czcgf" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.600235 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/aeab10b4-2f08-4eed-88eb-ba6f26db6cd0-ovs-rundir\") pod \"ovn-controller-metrics-czcgf\" (UID: \"aeab10b4-2f08-4eed-88eb-ba6f26db6cd0\") " pod="openstack/ovn-controller-metrics-czcgf" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.600308 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx4jh\" (UniqueName: \"kubernetes.io/projected/aeab10b4-2f08-4eed-88eb-ba6f26db6cd0-kube-api-access-gx4jh\") pod \"ovn-controller-metrics-czcgf\" (UID: \"aeab10b4-2f08-4eed-88eb-ba6f26db6cd0\") " pod="openstack/ovn-controller-metrics-czcgf" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.600419 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/aeab10b4-2f08-4eed-88eb-ba6f26db6cd0-ovn-rundir\") pod \"ovn-controller-metrics-czcgf\" (UID: \"aeab10b4-2f08-4eed-88eb-ba6f26db6cd0\") " pod="openstack/ovn-controller-metrics-czcgf" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.602939 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/aeab10b4-2f08-4eed-88eb-ba6f26db6cd0-ovs-rundir\") pod \"ovn-controller-metrics-czcgf\" (UID: \"aeab10b4-2f08-4eed-88eb-ba6f26db6cd0\") " pod="openstack/ovn-controller-metrics-czcgf" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.603537 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeab10b4-2f08-4eed-88eb-ba6f26db6cd0-config\") pod 
\"ovn-controller-metrics-czcgf\" (UID: \"aeab10b4-2f08-4eed-88eb-ba6f26db6cd0\") " pod="openstack/ovn-controller-metrics-czcgf" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.619612 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/aeab10b4-2f08-4eed-88eb-ba6f26db6cd0-ovn-rundir\") pod \"ovn-controller-metrics-czcgf\" (UID: \"aeab10b4-2f08-4eed-88eb-ba6f26db6cd0\") " pod="openstack/ovn-controller-metrics-czcgf" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.619807 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeab10b4-2f08-4eed-88eb-ba6f26db6cd0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-czcgf\" (UID: \"aeab10b4-2f08-4eed-88eb-ba6f26db6cd0\") " pod="openstack/ovn-controller-metrics-czcgf" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.619900 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeab10b4-2f08-4eed-88eb-ba6f26db6cd0-combined-ca-bundle\") pod \"ovn-controller-metrics-czcgf\" (UID: \"aeab10b4-2f08-4eed-88eb-ba6f26db6cd0\") " pod="openstack/ovn-controller-metrics-czcgf" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.630227 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx4jh\" (UniqueName: \"kubernetes.io/projected/aeab10b4-2f08-4eed-88eb-ba6f26db6cd0-kube-api-access-gx4jh\") pod \"ovn-controller-metrics-czcgf\" (UID: \"aeab10b4-2f08-4eed-88eb-ba6f26db6cd0\") " pod="openstack/ovn-controller-metrics-czcgf" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.661578 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2cgdn"] Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.663713 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-2cgdn" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.667280 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.674430 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2cgdn"] Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.701020 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-czcgf" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.721725 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a057026-5e22-4a88-b9ac-15ff57f5a9e2-config\") pod \"dnsmasq-dns-5bf47b49b7-2cgdn\" (UID: \"3a057026-5e22-4a88-b9ac-15ff57f5a9e2\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2cgdn" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.721885 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg6q5\" (UniqueName: \"kubernetes.io/projected/3a057026-5e22-4a88-b9ac-15ff57f5a9e2-kube-api-access-xg6q5\") pod \"dnsmasq-dns-5bf47b49b7-2cgdn\" (UID: \"3a057026-5e22-4a88-b9ac-15ff57f5a9e2\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2cgdn" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.721925 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a057026-5e22-4a88-b9ac-15ff57f5a9e2-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-2cgdn\" (UID: \"3a057026-5e22-4a88-b9ac-15ff57f5a9e2\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2cgdn" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.721979 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/3a057026-5e22-4a88-b9ac-15ff57f5a9e2-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-2cgdn\" (UID: \"3a057026-5e22-4a88-b9ac-15ff57f5a9e2\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2cgdn" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.744813 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-snz4k"] Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.784408 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-q75rz"] Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.796106 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-q75rz" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.801188 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.823804 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a057026-5e22-4a88-b9ac-15ff57f5a9e2-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-2cgdn\" (UID: \"3a057026-5e22-4a88-b9ac-15ff57f5a9e2\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2cgdn" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.823902 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a057026-5e22-4a88-b9ac-15ff57f5a9e2-config\") pod \"dnsmasq-dns-5bf47b49b7-2cgdn\" (UID: \"3a057026-5e22-4a88-b9ac-15ff57f5a9e2\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2cgdn" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.823976 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg6q5\" (UniqueName: \"kubernetes.io/projected/3a057026-5e22-4a88-b9ac-15ff57f5a9e2-kube-api-access-xg6q5\") pod \"dnsmasq-dns-5bf47b49b7-2cgdn\" (UID: \"3a057026-5e22-4a88-b9ac-15ff57f5a9e2\") " 
pod="openstack/dnsmasq-dns-5bf47b49b7-2cgdn" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.824006 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a057026-5e22-4a88-b9ac-15ff57f5a9e2-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-2cgdn\" (UID: \"3a057026-5e22-4a88-b9ac-15ff57f5a9e2\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2cgdn" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.824990 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a057026-5e22-4a88-b9ac-15ff57f5a9e2-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-2cgdn\" (UID: \"3a057026-5e22-4a88-b9ac-15ff57f5a9e2\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2cgdn" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.825049 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a057026-5e22-4a88-b9ac-15ff57f5a9e2-config\") pod \"dnsmasq-dns-5bf47b49b7-2cgdn\" (UID: \"3a057026-5e22-4a88-b9ac-15ff57f5a9e2\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2cgdn" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.825403 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a057026-5e22-4a88-b9ac-15ff57f5a9e2-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-2cgdn\" (UID: \"3a057026-5e22-4a88-b9ac-15ff57f5a9e2\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2cgdn" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.834679 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-q75rz"] Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.866308 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg6q5\" (UniqueName: \"kubernetes.io/projected/3a057026-5e22-4a88-b9ac-15ff57f5a9e2-kube-api-access-xg6q5\") pod 
\"dnsmasq-dns-5bf47b49b7-2cgdn\" (UID: \"3a057026-5e22-4a88-b9ac-15ff57f5a9e2\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2cgdn" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.925426 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1b7933b-8030-4832-87af-ee18342581b8-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-q75rz\" (UID: \"a1b7933b-8030-4832-87af-ee18342581b8\") " pod="openstack/dnsmasq-dns-8554648995-q75rz" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.926196 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1b7933b-8030-4832-87af-ee18342581b8-config\") pod \"dnsmasq-dns-8554648995-q75rz\" (UID: \"a1b7933b-8030-4832-87af-ee18342581b8\") " pod="openstack/dnsmasq-dns-8554648995-q75rz" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.926250 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k4zt\" (UniqueName: \"kubernetes.io/projected/a1b7933b-8030-4832-87af-ee18342581b8-kube-api-access-7k4zt\") pod \"dnsmasq-dns-8554648995-q75rz\" (UID: \"a1b7933b-8030-4832-87af-ee18342581b8\") " pod="openstack/dnsmasq-dns-8554648995-q75rz" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.926277 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1b7933b-8030-4832-87af-ee18342581b8-dns-svc\") pod \"dnsmasq-dns-8554648995-q75rz\" (UID: \"a1b7933b-8030-4832-87af-ee18342581b8\") " pod="openstack/dnsmasq-dns-8554648995-q75rz" Sep 29 10:04:00 crc kubenswrapper[4891]: I0929 10:04:00.926968 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a1b7933b-8030-4832-87af-ee18342581b8-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-q75rz\" (UID: \"a1b7933b-8030-4832-87af-ee18342581b8\") " pod="openstack/dnsmasq-dns-8554648995-q75rz" Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.002562 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-2cgdn" Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.029115 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1b7933b-8030-4832-87af-ee18342581b8-config\") pod \"dnsmasq-dns-8554648995-q75rz\" (UID: \"a1b7933b-8030-4832-87af-ee18342581b8\") " pod="openstack/dnsmasq-dns-8554648995-q75rz" Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.029161 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k4zt\" (UniqueName: \"kubernetes.io/projected/a1b7933b-8030-4832-87af-ee18342581b8-kube-api-access-7k4zt\") pod \"dnsmasq-dns-8554648995-q75rz\" (UID: \"a1b7933b-8030-4832-87af-ee18342581b8\") " pod="openstack/dnsmasq-dns-8554648995-q75rz" Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.029186 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1b7933b-8030-4832-87af-ee18342581b8-dns-svc\") pod \"dnsmasq-dns-8554648995-q75rz\" (UID: \"a1b7933b-8030-4832-87af-ee18342581b8\") " pod="openstack/dnsmasq-dns-8554648995-q75rz" Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.029253 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1b7933b-8030-4832-87af-ee18342581b8-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-q75rz\" (UID: \"a1b7933b-8030-4832-87af-ee18342581b8\") " pod="openstack/dnsmasq-dns-8554648995-q75rz" Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.029306 
4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1b7933b-8030-4832-87af-ee18342581b8-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-q75rz\" (UID: \"a1b7933b-8030-4832-87af-ee18342581b8\") " pod="openstack/dnsmasq-dns-8554648995-q75rz" Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.030331 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1b7933b-8030-4832-87af-ee18342581b8-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-q75rz\" (UID: \"a1b7933b-8030-4832-87af-ee18342581b8\") " pod="openstack/dnsmasq-dns-8554648995-q75rz" Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.030917 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1b7933b-8030-4832-87af-ee18342581b8-config\") pod \"dnsmasq-dns-8554648995-q75rz\" (UID: \"a1b7933b-8030-4832-87af-ee18342581b8\") " pod="openstack/dnsmasq-dns-8554648995-q75rz" Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.032475 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1b7933b-8030-4832-87af-ee18342581b8-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-q75rz\" (UID: \"a1b7933b-8030-4832-87af-ee18342581b8\") " pod="openstack/dnsmasq-dns-8554648995-q75rz" Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.032567 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1b7933b-8030-4832-87af-ee18342581b8-dns-svc\") pod \"dnsmasq-dns-8554648995-q75rz\" (UID: \"a1b7933b-8030-4832-87af-ee18342581b8\") " pod="openstack/dnsmasq-dns-8554648995-q75rz" Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.056657 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k4zt\" (UniqueName: 
\"kubernetes.io/projected/a1b7933b-8030-4832-87af-ee18342581b8-kube-api-access-7k4zt\") pod \"dnsmasq-dns-8554648995-q75rz\" (UID: \"a1b7933b-8030-4832-87af-ee18342581b8\") " pod="openstack/dnsmasq-dns-8554648995-q75rz" Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.136374 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-q75rz" Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.188265 4891 generic.go:334] "Generic (PLEG): container finished" podID="26add200-3f00-406b-8d30-565e1e51fbd3" containerID="cd7aca78aea76671fe6a06caf40053c3e852cd4490584e327391c7b48e56f3c8" exitCode=0 Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.189251 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"26add200-3f00-406b-8d30-565e1e51fbd3","Type":"ContainerDied","Data":"cd7aca78aea76671fe6a06caf40053c3e852cd4490584e327391c7b48e56f3c8"} Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.192118 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mtbzs" Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.239619 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkdjm\" (UniqueName: \"kubernetes.io/projected/5435582b-ebc3-42e5-8a85-65eef8fe8263-kube-api-access-hkdjm\") pod \"5435582b-ebc3-42e5-8a85-65eef8fe8263\" (UID: \"5435582b-ebc3-42e5-8a85-65eef8fe8263\") " Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.239771 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5435582b-ebc3-42e5-8a85-65eef8fe8263-config\") pod \"5435582b-ebc3-42e5-8a85-65eef8fe8263\" (UID: \"5435582b-ebc3-42e5-8a85-65eef8fe8263\") " Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.239856 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5435582b-ebc3-42e5-8a85-65eef8fe8263-dns-svc\") pod \"5435582b-ebc3-42e5-8a85-65eef8fe8263\" (UID: \"5435582b-ebc3-42e5-8a85-65eef8fe8263\") " Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.242425 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5435582b-ebc3-42e5-8a85-65eef8fe8263-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5435582b-ebc3-42e5-8a85-65eef8fe8263" (UID: "5435582b-ebc3-42e5-8a85-65eef8fe8263"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.242702 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5435582b-ebc3-42e5-8a85-65eef8fe8263-config" (OuterVolumeSpecName: "config") pod "5435582b-ebc3-42e5-8a85-65eef8fe8263" (UID: "5435582b-ebc3-42e5-8a85-65eef8fe8263"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.253130 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5435582b-ebc3-42e5-8a85-65eef8fe8263-kube-api-access-hkdjm" (OuterVolumeSpecName: "kube-api-access-hkdjm") pod "5435582b-ebc3-42e5-8a85-65eef8fe8263" (UID: "5435582b-ebc3-42e5-8a85-65eef8fe8263"). InnerVolumeSpecName "kube-api-access-hkdjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.342616 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkdjm\" (UniqueName: \"kubernetes.io/projected/5435582b-ebc3-42e5-8a85-65eef8fe8263-kube-api-access-hkdjm\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.342657 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5435582b-ebc3-42e5-8a85-65eef8fe8263-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.342666 4891 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5435582b-ebc3-42e5-8a85-65eef8fe8263-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.868758 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-snz4k" Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.953560 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9326b29-45ea-4953-ab07-fe1812d0fdde-config\") pod \"d9326b29-45ea-4953-ab07-fe1812d0fdde\" (UID: \"d9326b29-45ea-4953-ab07-fe1812d0fdde\") " Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.953621 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9326b29-45ea-4953-ab07-fe1812d0fdde-dns-svc\") pod \"d9326b29-45ea-4953-ab07-fe1812d0fdde\" (UID: \"d9326b29-45ea-4953-ab07-fe1812d0fdde\") " Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.953679 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfpcm\" (UniqueName: \"kubernetes.io/projected/d9326b29-45ea-4953-ab07-fe1812d0fdde-kube-api-access-tfpcm\") pod \"d9326b29-45ea-4953-ab07-fe1812d0fdde\" (UID: \"d9326b29-45ea-4953-ab07-fe1812d0fdde\") " Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.954257 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9326b29-45ea-4953-ab07-fe1812d0fdde-config" (OuterVolumeSpecName: "config") pod "d9326b29-45ea-4953-ab07-fe1812d0fdde" (UID: "d9326b29-45ea-4953-ab07-fe1812d0fdde"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.954533 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9326b29-45ea-4953-ab07-fe1812d0fdde-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d9326b29-45ea-4953-ab07-fe1812d0fdde" (UID: "d9326b29-45ea-4953-ab07-fe1812d0fdde"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:04:01 crc kubenswrapper[4891]: I0929 10:04:01.958200 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9326b29-45ea-4953-ab07-fe1812d0fdde-kube-api-access-tfpcm" (OuterVolumeSpecName: "kube-api-access-tfpcm") pod "d9326b29-45ea-4953-ab07-fe1812d0fdde" (UID: "d9326b29-45ea-4953-ab07-fe1812d0fdde"). InnerVolumeSpecName "kube-api-access-tfpcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:04:02 crc kubenswrapper[4891]: I0929 10:04:02.055328 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfpcm\" (UniqueName: \"kubernetes.io/projected/d9326b29-45ea-4953-ab07-fe1812d0fdde-kube-api-access-tfpcm\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:02 crc kubenswrapper[4891]: I0929 10:04:02.055365 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9326b29-45ea-4953-ab07-fe1812d0fdde-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:02 crc kubenswrapper[4891]: I0929 10:04:02.055375 4891 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9326b29-45ea-4953-ab07-fe1812d0fdde-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:02 crc kubenswrapper[4891]: I0929 10:04:02.212174 4891 generic.go:334] "Generic (PLEG): container finished" podID="71a06463-0c24-4e9a-a4e7-4b0143207f46" containerID="31e4152ce3c5caff32e1310bd85cd2881c1c7b1d585f5618e270453b31b023fe" exitCode=0 Sep 29 10:04:02 crc kubenswrapper[4891]: I0929 10:04:02.212289 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"71a06463-0c24-4e9a-a4e7-4b0143207f46","Type":"ContainerDied","Data":"31e4152ce3c5caff32e1310bd85cd2881c1c7b1d585f5618e270453b31b023fe"} Sep 29 10:04:02 crc kubenswrapper[4891]: I0929 10:04:02.217009 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-57d769cc4f-mtbzs" event={"ID":"5435582b-ebc3-42e5-8a85-65eef8fe8263","Type":"ContainerDied","Data":"ece187dcc289e34fd72d355309f3b66d2808ef7394f7268c7fd0ec0e621f551b"} Sep 29 10:04:02 crc kubenswrapper[4891]: I0929 10:04:02.217116 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mtbzs" Sep 29 10:04:02 crc kubenswrapper[4891]: I0929 10:04:02.218770 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-snz4k" event={"ID":"d9326b29-45ea-4953-ab07-fe1812d0fdde","Type":"ContainerDied","Data":"9ca049e05351e5c8386119dac7d33b86ceb9b96e58c37fb68725848c381bf717"} Sep 29 10:04:02 crc kubenswrapper[4891]: I0929 10:04:02.218856 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-snz4k" Sep 29 10:04:02 crc kubenswrapper[4891]: I0929 10:04:02.307973 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mtbzs"] Sep 29 10:04:02 crc kubenswrapper[4891]: I0929 10:04:02.310026 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mtbzs"] Sep 29 10:04:02 crc kubenswrapper[4891]: I0929 10:04:02.367484 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-snz4k"] Sep 29 10:04:02 crc kubenswrapper[4891]: I0929 10:04:02.372708 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-snz4k"] Sep 29 10:04:02 crc kubenswrapper[4891]: I0929 10:04:02.478220 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5435582b-ebc3-42e5-8a85-65eef8fe8263" path="/var/lib/kubelet/pods/5435582b-ebc3-42e5-8a85-65eef8fe8263/volumes" Sep 29 10:04:02 crc kubenswrapper[4891]: I0929 10:04:02.478759 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9326b29-45ea-4953-ab07-fe1812d0fdde" 
path="/var/lib/kubelet/pods/d9326b29-45ea-4953-ab07-fe1812d0fdde/volumes" Sep 29 10:04:02 crc kubenswrapper[4891]: I0929 10:04:02.524101 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Sep 29 10:04:02 crc kubenswrapper[4891]: I0929 10:04:02.906352 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-czcgf"] Sep 29 10:04:03 crc kubenswrapper[4891]: I0929 10:04:03.052531 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2cgdn"] Sep 29 10:04:03 crc kubenswrapper[4891]: I0929 10:04:03.092687 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-q75rz"] Sep 29 10:04:03 crc kubenswrapper[4891]: W0929 10:04:03.104360 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1b7933b_8030_4832_87af_ee18342581b8.slice/crio-963acb496f4fb2c6b5ffc3b004cb6181541eb20550eb34b0e06734853c13832a WatchSource:0}: Error finding container 963acb496f4fb2c6b5ffc3b004cb6181541eb20550eb34b0e06734853c13832a: Status 404 returned error can't find the container with id 963acb496f4fb2c6b5ffc3b004cb6181541eb20550eb34b0e06734853c13832a Sep 29 10:04:03 crc kubenswrapper[4891]: I0929 10:04:03.229268 4891 generic.go:334] "Generic (PLEG): container finished" podID="c1fc0a48-e2b3-479b-948c-ff2279a7205c" containerID="711c92cb0e6a373963bcf1b675bd6a74cc7393fcec4f1a8399307740fa323e9c" exitCode=0 Sep 29 10:04:03 crc kubenswrapper[4891]: I0929 10:04:03.229838 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vxc7p" event={"ID":"c1fc0a48-e2b3-479b-948c-ff2279a7205c","Type":"ContainerDied","Data":"711c92cb0e6a373963bcf1b675bd6a74cc7393fcec4f1a8399307740fa323e9c"} Sep 29 10:04:03 crc kubenswrapper[4891]: I0929 10:04:03.232702 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"c6033a7a-34ac-409d-ab81-035b291364aa","Type":"ContainerStarted","Data":"6bc02eb99b2a653c76002857089418cdeac78c9443edf4771a546450659e4773"} Sep 29 10:04:03 crc kubenswrapper[4891]: I0929 10:04:03.236693 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"807cd996-3d20-4f16-b5bb-3b4e4da82775","Type":"ContainerStarted","Data":"1f2c89e594e2ba7a35199e197dc80950c58dfab58084c418c1bfcf16c22aa2df"} Sep 29 10:04:03 crc kubenswrapper[4891]: I0929 10:04:03.256073 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"26add200-3f00-406b-8d30-565e1e51fbd3","Type":"ContainerStarted","Data":"e18b5cf46f2213cd2f27379c1f86a965f1d2ad146ae335af2508a163f4105793"} Sep 29 10:04:03 crc kubenswrapper[4891]: I0929 10:04:03.266044 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jq4xk" event={"ID":"7484acb7-f4b2-417b-a478-86b8c5999c34","Type":"ContainerStarted","Data":"0f9c16b504259a6abeb0283867a8fa6365ac488e5e8afd919d1f4c9d9fc12b85"} Sep 29 10:04:03 crc kubenswrapper[4891]: I0929 10:04:03.266368 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-jq4xk" Sep 29 10:04:03 crc kubenswrapper[4891]: I0929 10:04:03.268832 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-2cgdn" event={"ID":"3a057026-5e22-4a88-b9ac-15ff57f5a9e2","Type":"ContainerStarted","Data":"e57f9e7eaa091cfb0d1a67ddcaec81d7a90c70e3cfd0b6e75e6a990422e301f6"} Sep 29 10:04:03 crc kubenswrapper[4891]: I0929 10:04:03.272960 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"71a06463-0c24-4e9a-a4e7-4b0143207f46","Type":"ContainerStarted","Data":"a435f7153b16512117d7b58a33b0f3c3a155570cef7d1f8721202cc88b24d06a"} Sep 29 10:04:03 crc kubenswrapper[4891]: I0929 10:04:03.274622 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-8554648995-q75rz" event={"ID":"a1b7933b-8030-4832-87af-ee18342581b8","Type":"ContainerStarted","Data":"963acb496f4fb2c6b5ffc3b004cb6181541eb20550eb34b0e06734853c13832a"} Sep 29 10:04:03 crc kubenswrapper[4891]: I0929 10:04:03.281212 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.872571724 podStartE2EDuration="33.281186383s" podCreationTimestamp="2025-09-29 10:03:30 +0000 UTC" firstStartedPulling="2025-09-29 10:03:32.302878211 +0000 UTC m=+942.508046532" lastFinishedPulling="2025-09-29 10:03:56.71149287 +0000 UTC m=+966.916661191" observedRunningTime="2025-09-29 10:04:03.276563827 +0000 UTC m=+973.481732138" watchObservedRunningTime="2025-09-29 10:04:03.281186383 +0000 UTC m=+973.486354704" Sep 29 10:04:03 crc kubenswrapper[4891]: I0929 10:04:03.284512 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-czcgf" event={"ID":"aeab10b4-2f08-4eed-88eb-ba6f26db6cd0","Type":"ContainerStarted","Data":"58592701b9c117d00551f8079b08c63692c4f84b66c4b1bac8c1b488e3d38a33"} Sep 29 10:04:03 crc kubenswrapper[4891]: I0929 10:04:03.326049 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223372003.528753 podStartE2EDuration="33.32602219s" podCreationTimestamp="2025-09-29 10:03:30 +0000 UTC" firstStartedPulling="2025-09-29 10:03:32.853698457 +0000 UTC m=+943.058866778" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:04:03.31650809 +0000 UTC m=+973.521676421" watchObservedRunningTime="2025-09-29 10:04:03.32602219 +0000 UTC m=+973.531190521" Sep 29 10:04:03 crc kubenswrapper[4891]: I0929 10:04:03.342501 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-jq4xk" podStartSLOduration=22.946386963 podStartE2EDuration="27.342479303s" podCreationTimestamp="2025-09-29 
10:03:36 +0000 UTC" firstStartedPulling="2025-09-29 10:03:57.773176769 +0000 UTC m=+967.978345080" lastFinishedPulling="2025-09-29 10:04:02.169269099 +0000 UTC m=+972.374437420" observedRunningTime="2025-09-29 10:04:03.340937648 +0000 UTC m=+973.546105989" watchObservedRunningTime="2025-09-29 10:04:03.342479303 +0000 UTC m=+973.547647624" Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.269348 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2cgdn"] Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.294484 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-kb7z9"] Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.296121 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.313386 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-kb7z9"] Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.335120 4891 generic.go:334] "Generic (PLEG): container finished" podID="a1b7933b-8030-4832-87af-ee18342581b8" containerID="c0d04d2efd165c06a40b700d7a587bd5ae6fcd619d81350edad6dae13c0cbda9" exitCode=0 Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.335345 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-q75rz" event={"ID":"a1b7933b-8030-4832-87af-ee18342581b8","Type":"ContainerDied","Data":"c0d04d2efd165c06a40b700d7a587bd5ae6fcd619d81350edad6dae13c0cbda9"} Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.352516 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vxc7p" event={"ID":"c1fc0a48-e2b3-479b-948c-ff2279a7205c","Type":"ContainerStarted","Data":"d729fe4c82311a068cba9185b7a6deb1f4d6f4ae8a198fdd9f87bcc43b629df3"} Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.352585 4891 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ovn-controller-ovs-vxc7p" event={"ID":"c1fc0a48-e2b3-479b-948c-ff2279a7205c","Type":"ContainerStarted","Data":"0e4af6d2ef927951f172ee7cae10ebe7e97e6b0cd74c65819194a4cf004d55b8"} Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.352698 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-vxc7p" Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.352731 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-vxc7p" Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.375855 4891 generic.go:334] "Generic (PLEG): container finished" podID="3a057026-5e22-4a88-b9ac-15ff57f5a9e2" containerID="183100a9ebf832efd96614e494acdb49728058bb828f714b7944052442033ad8" exitCode=0 Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.379028 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-2cgdn" event={"ID":"3a057026-5e22-4a88-b9ac-15ff57f5a9e2","Type":"ContainerDied","Data":"183100a9ebf832efd96614e494acdb49728058bb828f714b7944052442033ad8"} Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.393766 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.420322 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6scf\" (UniqueName: \"kubernetes.io/projected/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-kube-api-access-z6scf\") pod \"dnsmasq-dns-b8fbc5445-kb7z9\" (UID: \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\") " pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.420410 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-ovsdbserver-sb\") pod 
\"dnsmasq-dns-b8fbc5445-kb7z9\" (UID: \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\") " pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.420464 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-config\") pod \"dnsmasq-dns-b8fbc5445-kb7z9\" (UID: \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\") " pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.420546 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-kb7z9\" (UID: \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\") " pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.420579 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-kb7z9\" (UID: \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\") " pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.427753 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-vxc7p" podStartSLOduration=23.814725123 podStartE2EDuration="28.427711573s" podCreationTimestamp="2025-09-29 10:03:36 +0000 UTC" firstStartedPulling="2025-09-29 10:03:57.556235358 +0000 UTC m=+967.761403679" lastFinishedPulling="2025-09-29 10:04:02.169221818 +0000 UTC m=+972.374390129" observedRunningTime="2025-09-29 10:04:04.394874409 +0000 UTC m=+974.600042730" watchObservedRunningTime="2025-09-29 10:04:04.427711573 +0000 UTC m=+974.632879894" Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.483376 
4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8fd6ea18-7472-42de-b949-140181cd55a5","Type":"ContainerStarted","Data":"a53fb837cafbe9f35e7bc1f41d1176489dd2cd6ae2323976c557b3e7bf630487"} Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.522231 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-config\") pod \"dnsmasq-dns-b8fbc5445-kb7z9\" (UID: \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\") " pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.522663 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-kb7z9\" (UID: \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\") " pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.522711 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-kb7z9\" (UID: \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\") " pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.522748 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6scf\" (UniqueName: \"kubernetes.io/projected/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-kube-api-access-z6scf\") pod \"dnsmasq-dns-b8fbc5445-kb7z9\" (UID: \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\") " pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.522867 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-kb7z9\" (UID: \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\") " pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.525269 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-config\") pod \"dnsmasq-dns-b8fbc5445-kb7z9\" (UID: \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\") " pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.525975 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-kb7z9\" (UID: \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\") " pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.527769 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-kb7z9\" (UID: \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\") " pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.530114 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-kb7z9\" (UID: \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\") " pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.551541 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6scf\" (UniqueName: \"kubernetes.io/projected/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-kube-api-access-z6scf\") pod \"dnsmasq-dns-b8fbc5445-kb7z9\" 
(UID: \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\") " pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" Sep 29 10:04:04 crc kubenswrapper[4891]: I0929 10:04:04.647560 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" Sep 29 10:04:04 crc kubenswrapper[4891]: E0929 10:04:04.822759 4891 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Sep 29 10:04:04 crc kubenswrapper[4891]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/a1b7933b-8030-4832-87af-ee18342581b8/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Sep 29 10:04:04 crc kubenswrapper[4891]: > podSandboxID="963acb496f4fb2c6b5ffc3b004cb6181541eb20550eb34b0e06734853c13832a" Sep 29 10:04:04 crc kubenswrapper[4891]: E0929 10:04:04.823316 4891 kuberuntime_manager.go:1274] "Unhandled Error" err=< Sep 29 10:04:04 crc kubenswrapper[4891]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n654h99h64ch5dbh6dh555h587h64bh5cfh647h5fdh57ch679h9h597h5f5hbch59bh54fh575h566h667h586h5f5h65ch5bch57h68h65ch58bh694h5cfq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7k4zt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8554648995-q75rz_openstack(a1b7933b-8030-4832-87af-ee18342581b8): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/a1b7933b-8030-4832-87af-ee18342581b8/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Sep 29 10:04:04 crc kubenswrapper[4891]: > logger="UnhandledError" Sep 29 10:04:04 crc kubenswrapper[4891]: E0929 10:04:04.824511 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/a1b7933b-8030-4832-87af-ee18342581b8/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-8554648995-q75rz" podUID="a1b7933b-8030-4832-87af-ee18342581b8" Sep 29 10:04:04 crc kubenswrapper[4891]: E0929 10:04:04.838159 4891 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Sep 29 10:04:04 crc kubenswrapper[4891]: rpc error: code = Unknown desc = container create failed: mount 
`/var/lib/kubelet/pods/3a057026-5e22-4a88-b9ac-15ff57f5a9e2/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Sep 29 10:04:04 crc kubenswrapper[4891]: > podSandboxID="e57f9e7eaa091cfb0d1a67ddcaec81d7a90c70e3cfd0b6e75e6a990422e301f6" Sep 29 10:04:04 crc kubenswrapper[4891]: E0929 10:04:04.838354 4891 kuberuntime_manager.go:1274] "Unhandled Error" err=< Sep 29 10:04:04 crc kubenswrapper[4891]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n58bh65dh95hf6h595hf6hf5h59dh6h57dh558h55ch5dbh5f5h565h5f7h9fh76h58ch54dh84h59bh7fh6bh5b9h59h67fh566h56h5f4h554h58fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xg6q5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,Recurs
iveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5bf47b49b7-2cgdn_openstack(3a057026-5e22-4a88-b9ac-15ff57f5a9e2): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/3a057026-5e22-4a88-b9ac-15ff57f5a9e2/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Sep 29 10:04:04 crc kubenswrapper[4891]: > logger="UnhandledError" Sep 29 10:04:04 crc kubenswrapper[4891]: E0929 10:04:04.839549 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/3a057026-5e22-4a88-b9ac-15ff57f5a9e2/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" 
pod="openstack/dnsmasq-dns-5bf47b49b7-2cgdn" podUID="3a057026-5e22-4a88-b9ac-15ff57f5a9e2" Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 10:04:05.143527 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-kb7z9"] Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 10:04:05.411203 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 10:04:05.424414 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 10:04:05.426418 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 10:04:05.426614 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 10:04:05.426924 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 10:04:05.428700 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-v5trl" Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 10:04:05.428749 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 10:04:05.435890 4891 generic.go:334] "Generic (PLEG): container finished" podID="e8960beb-b8cf-4c19-9ae8-1a8ce3f52897" containerID="6b5bf3146316cd7c22d46d1eac96abb94b2b092b396e9815cee8daabd97e6b16" exitCode=0 Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 10:04:05.435959 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" event={"ID":"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897","Type":"ContainerDied","Data":"6b5bf3146316cd7c22d46d1eac96abb94b2b092b396e9815cee8daabd97e6b16"} Sep 29 10:04:05 crc kubenswrapper[4891]: 
I0929 10:04:05.436026 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" event={"ID":"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897","Type":"ContainerStarted","Data":"488268ea32d4ca9973cd600839d9a09ca3933f4c08e836b183fd4f63b68c026d"} Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 10:04:05.545990 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-etc-swift\") pod \"swift-storage-0\" (UID: \"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e\") " pod="openstack/swift-storage-0" Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 10:04:05.546128 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-cache\") pod \"swift-storage-0\" (UID: \"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e\") " pod="openstack/swift-storage-0" Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 10:04:05.546162 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-lock\") pod \"swift-storage-0\" (UID: \"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e\") " pod="openstack/swift-storage-0" Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 10:04:05.546198 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e\") " pod="openstack/swift-storage-0" Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 10:04:05.546402 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vtp2\" (UniqueName: 
\"kubernetes.io/projected/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-kube-api-access-5vtp2\") pod \"swift-storage-0\" (UID: \"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e\") " pod="openstack/swift-storage-0" Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 10:04:05.648618 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vtp2\" (UniqueName: \"kubernetes.io/projected/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-kube-api-access-5vtp2\") pod \"swift-storage-0\" (UID: \"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e\") " pod="openstack/swift-storage-0" Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 10:04:05.650015 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-etc-swift\") pod \"swift-storage-0\" (UID: \"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e\") " pod="openstack/swift-storage-0" Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 10:04:05.650688 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-cache\") pod \"swift-storage-0\" (UID: \"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e\") " pod="openstack/swift-storage-0" Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 10:04:05.650740 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-lock\") pod \"swift-storage-0\" (UID: \"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e\") " pod="openstack/swift-storage-0" Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 10:04:05.650769 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e\") " pod="openstack/swift-storage-0" Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 
10:04:05.651405 4891 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/swift-storage-0" Sep 29 10:04:05 crc kubenswrapper[4891]: E0929 10:04:05.653905 4891 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 29 10:04:05 crc kubenswrapper[4891]: E0929 10:04:05.653970 4891 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 29 10:04:05 crc kubenswrapper[4891]: E0929 10:04:05.654065 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-etc-swift podName:66ec83ce-b4c6-412b-b7c4-6a61c6914c0e nodeName:}" failed. No retries permitted until 2025-09-29 10:04:06.154038127 +0000 UTC m=+976.359206448 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-etc-swift") pod "swift-storage-0" (UID: "66ec83ce-b4c6-412b-b7c4-6a61c6914c0e") : configmap "swift-ring-files" not found Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 10:04:05.654825 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-cache\") pod \"swift-storage-0\" (UID: \"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e\") " pod="openstack/swift-storage-0" Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 10:04:05.655010 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-lock\") pod \"swift-storage-0\" (UID: \"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e\") " pod="openstack/swift-storage-0" Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 10:04:05.679519 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e\") " pod="openstack/swift-storage-0" Sep 29 10:04:05 crc kubenswrapper[4891]: I0929 10:04:05.679695 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vtp2\" (UniqueName: \"kubernetes.io/projected/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-kube-api-access-5vtp2\") pod \"swift-storage-0\" (UID: \"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e\") " pod="openstack/swift-storage-0" Sep 29 10:04:06 crc kubenswrapper[4891]: I0929 10:04:06.161125 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-etc-swift\") pod \"swift-storage-0\" (UID: \"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e\") " pod="openstack/swift-storage-0" Sep 29 10:04:06 crc 
kubenswrapper[4891]: E0929 10:04:06.161323 4891 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 29 10:04:06 crc kubenswrapper[4891]: E0929 10:04:06.161345 4891 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 29 10:04:06 crc kubenswrapper[4891]: E0929 10:04:06.161400 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-etc-swift podName:66ec83ce-b4c6-412b-b7c4-6a61c6914c0e nodeName:}" failed. No retries permitted until 2025-09-29 10:04:07.161380846 +0000 UTC m=+977.366549167 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-etc-swift") pod "swift-storage-0" (UID: "66ec83ce-b4c6-412b-b7c4-6a61c6914c0e") : configmap "swift-ring-files" not found Sep 29 10:04:06 crc kubenswrapper[4891]: I0929 10:04:06.186736 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:04:06 crc kubenswrapper[4891]: I0929 10:04:06.186840 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:04:06 crc kubenswrapper[4891]: I0929 10:04:06.750357 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-2cgdn" Sep 29 10:04:06 crc kubenswrapper[4891]: I0929 10:04:06.876237 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a057026-5e22-4a88-b9ac-15ff57f5a9e2-dns-svc\") pod \"3a057026-5e22-4a88-b9ac-15ff57f5a9e2\" (UID: \"3a057026-5e22-4a88-b9ac-15ff57f5a9e2\") " Sep 29 10:04:06 crc kubenswrapper[4891]: I0929 10:04:06.876311 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a057026-5e22-4a88-b9ac-15ff57f5a9e2-config\") pod \"3a057026-5e22-4a88-b9ac-15ff57f5a9e2\" (UID: \"3a057026-5e22-4a88-b9ac-15ff57f5a9e2\") " Sep 29 10:04:06 crc kubenswrapper[4891]: I0929 10:04:06.876412 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a057026-5e22-4a88-b9ac-15ff57f5a9e2-ovsdbserver-nb\") pod \"3a057026-5e22-4a88-b9ac-15ff57f5a9e2\" (UID: \"3a057026-5e22-4a88-b9ac-15ff57f5a9e2\") " Sep 29 10:04:06 crc kubenswrapper[4891]: I0929 10:04:06.876452 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg6q5\" (UniqueName: \"kubernetes.io/projected/3a057026-5e22-4a88-b9ac-15ff57f5a9e2-kube-api-access-xg6q5\") pod \"3a057026-5e22-4a88-b9ac-15ff57f5a9e2\" (UID: \"3a057026-5e22-4a88-b9ac-15ff57f5a9e2\") " Sep 29 10:04:06 crc kubenswrapper[4891]: I0929 10:04:06.890287 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a057026-5e22-4a88-b9ac-15ff57f5a9e2-kube-api-access-xg6q5" (OuterVolumeSpecName: "kube-api-access-xg6q5") pod "3a057026-5e22-4a88-b9ac-15ff57f5a9e2" (UID: "3a057026-5e22-4a88-b9ac-15ff57f5a9e2"). InnerVolumeSpecName "kube-api-access-xg6q5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:04:06 crc kubenswrapper[4891]: I0929 10:04:06.977441 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a057026-5e22-4a88-b9ac-15ff57f5a9e2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3a057026-5e22-4a88-b9ac-15ff57f5a9e2" (UID: "3a057026-5e22-4a88-b9ac-15ff57f5a9e2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:04:06 crc kubenswrapper[4891]: I0929 10:04:06.978734 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a057026-5e22-4a88-b9ac-15ff57f5a9e2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:06 crc kubenswrapper[4891]: I0929 10:04:06.978755 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg6q5\" (UniqueName: \"kubernetes.io/projected/3a057026-5e22-4a88-b9ac-15ff57f5a9e2-kube-api-access-xg6q5\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:06 crc kubenswrapper[4891]: I0929 10:04:06.988040 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a057026-5e22-4a88-b9ac-15ff57f5a9e2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3a057026-5e22-4a88-b9ac-15ff57f5a9e2" (UID: "3a057026-5e22-4a88-b9ac-15ff57f5a9e2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:04:07 crc kubenswrapper[4891]: I0929 10:04:07.007281 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a057026-5e22-4a88-b9ac-15ff57f5a9e2-config" (OuterVolumeSpecName: "config") pod "3a057026-5e22-4a88-b9ac-15ff57f5a9e2" (UID: "3a057026-5e22-4a88-b9ac-15ff57f5a9e2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:04:07 crc kubenswrapper[4891]: I0929 10:04:07.082689 4891 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a057026-5e22-4a88-b9ac-15ff57f5a9e2-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:07 crc kubenswrapper[4891]: I0929 10:04:07.082737 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a057026-5e22-4a88-b9ac-15ff57f5a9e2-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:07 crc kubenswrapper[4891]: I0929 10:04:07.184300 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-etc-swift\") pod \"swift-storage-0\" (UID: \"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e\") " pod="openstack/swift-storage-0" Sep 29 10:04:07 crc kubenswrapper[4891]: E0929 10:04:07.184612 4891 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 29 10:04:07 crc kubenswrapper[4891]: E0929 10:04:07.184655 4891 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 29 10:04:07 crc kubenswrapper[4891]: E0929 10:04:07.184746 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-etc-swift podName:66ec83ce-b4c6-412b-b7c4-6a61c6914c0e nodeName:}" failed. No retries permitted until 2025-09-29 10:04:09.184717189 +0000 UTC m=+979.389885510 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-etc-swift") pod "swift-storage-0" (UID: "66ec83ce-b4c6-412b-b7c4-6a61c6914c0e") : configmap "swift-ring-files" not found Sep 29 10:04:07 crc kubenswrapper[4891]: I0929 10:04:07.453932 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-2cgdn" event={"ID":"3a057026-5e22-4a88-b9ac-15ff57f5a9e2","Type":"ContainerDied","Data":"e57f9e7eaa091cfb0d1a67ddcaec81d7a90c70e3cfd0b6e75e6a990422e301f6"} Sep 29 10:04:07 crc kubenswrapper[4891]: I0929 10:04:07.454281 4891 scope.go:117] "RemoveContainer" containerID="183100a9ebf832efd96614e494acdb49728058bb828f714b7944052442033ad8" Sep 29 10:04:07 crc kubenswrapper[4891]: I0929 10:04:07.454468 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-2cgdn" Sep 29 10:04:07 crc kubenswrapper[4891]: I0929 10:04:07.525375 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2cgdn"] Sep 29 10:04:07 crc kubenswrapper[4891]: I0929 10:04:07.534257 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2cgdn"] Sep 29 10:04:07 crc kubenswrapper[4891]: E0929 10:04:07.570107 4891 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.151:60404->38.102.83.151:34867: read tcp 38.102.83.151:60404->38.102.83.151:34867: read: connection reset by peer Sep 29 10:04:08 crc kubenswrapper[4891]: I0929 10:04:08.411262 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a057026-5e22-4a88-b9ac-15ff57f5a9e2" path="/var/lib/kubelet/pods/3a057026-5e22-4a88-b9ac-15ff57f5a9e2/volumes" Sep 29 10:04:08 crc kubenswrapper[4891]: I0929 10:04:08.467518 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"807cd996-3d20-4f16-b5bb-3b4e4da82775","Type":"ContainerStarted","Data":"7e51835a4f2b115663e9254ab93381247b92d853800834df1eeaddd8c881243a"} Sep 29 10:04:08 crc kubenswrapper[4891]: I0929 10:04:08.471450 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-q75rz" event={"ID":"a1b7933b-8030-4832-87af-ee18342581b8","Type":"ContainerStarted","Data":"00fa1ab15490c2768b4465c68439c0b780e0117eaaeaeda27ccaf56988a80553"} Sep 29 10:04:08 crc kubenswrapper[4891]: I0929 10:04:08.471843 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-q75rz" Sep 29 10:04:08 crc kubenswrapper[4891]: I0929 10:04:08.475778 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" event={"ID":"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897","Type":"ContainerStarted","Data":"11d02efd4b91519f57a0828c04b345f86fcb0f8b36b5fdf2e95bfed3c64949d0"} Sep 29 10:04:08 crc kubenswrapper[4891]: I0929 10:04:08.478559 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-czcgf" event={"ID":"aeab10b4-2f08-4eed-88eb-ba6f26db6cd0","Type":"ContainerStarted","Data":"ff5ea986c5d2500c7ea62443e80262cb9fcffeb71b142315e00bde9a1ddcce11"} Sep 29 10:04:08 crc kubenswrapper[4891]: I0929 10:04:08.480691 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c6033a7a-34ac-409d-ab81-035b291364aa","Type":"ContainerStarted","Data":"4b93f89802c0ca11e2a042492e0d5123fa1f9a56f6851a818827689402f83cce"} Sep 29 10:04:08 crc kubenswrapper[4891]: I0929 10:04:08.504310 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=20.770785096 podStartE2EDuration="30.504278651s" podCreationTimestamp="2025-09-29 10:03:38 +0000 UTC" firstStartedPulling="2025-09-29 10:03:57.828410241 +0000 UTC m=+968.033578562" lastFinishedPulling="2025-09-29 10:04:07.561903776 
+0000 UTC m=+977.767072117" observedRunningTime="2025-09-29 10:04:08.499030966 +0000 UTC m=+978.704199307" watchObservedRunningTime="2025-09-29 10:04:08.504278651 +0000 UTC m=+978.709446982" Sep 29 10:04:08 crc kubenswrapper[4891]: I0929 10:04:08.529212 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-czcgf" podStartSLOduration=3.880313108 podStartE2EDuration="8.529187132s" podCreationTimestamp="2025-09-29 10:04:00 +0000 UTC" firstStartedPulling="2025-09-29 10:04:02.912834566 +0000 UTC m=+973.118002887" lastFinishedPulling="2025-09-29 10:04:07.56170859 +0000 UTC m=+977.766876911" observedRunningTime="2025-09-29 10:04:08.521523677 +0000 UTC m=+978.726691998" watchObservedRunningTime="2025-09-29 10:04:08.529187132 +0000 UTC m=+978.734355453" Sep 29 10:04:08 crc kubenswrapper[4891]: I0929 10:04:08.565883 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-q75rz" podStartSLOduration=7.931679375 podStartE2EDuration="8.565864199s" podCreationTimestamp="2025-09-29 10:04:00 +0000 UTC" firstStartedPulling="2025-09-29 10:04:03.106214455 +0000 UTC m=+973.311382776" lastFinishedPulling="2025-09-29 10:04:03.740399279 +0000 UTC m=+973.945567600" observedRunningTime="2025-09-29 10:04:08.56453881 +0000 UTC m=+978.769707161" watchObservedRunningTime="2025-09-29 10:04:08.565864199 +0000 UTC m=+978.771032520" Sep 29 10:04:08 crc kubenswrapper[4891]: I0929 10:04:08.597318 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=18.783502249 podStartE2EDuration="28.597291282s" podCreationTimestamp="2025-09-29 10:03:40 +0000 UTC" firstStartedPulling="2025-09-29 10:03:57.769601004 +0000 UTC m=+967.974769335" lastFinishedPulling="2025-09-29 10:04:07.583390047 +0000 UTC m=+977.788558368" observedRunningTime="2025-09-29 10:04:08.589536924 +0000 UTC m=+978.794705245" watchObservedRunningTime="2025-09-29 
10:04:08.597291282 +0000 UTC m=+978.802459603" Sep 29 10:04:08 crc kubenswrapper[4891]: I0929 10:04:08.657222 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" podStartSLOduration=4.657191101 podStartE2EDuration="4.657191101s" podCreationTimestamp="2025-09-29 10:04:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:04:08.617858406 +0000 UTC m=+978.823026757" watchObservedRunningTime="2025-09-29 10:04:08.657191101 +0000 UTC m=+978.862359412" Sep 29 10:04:08 crc kubenswrapper[4891]: I0929 10:04:08.660657 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Sep 29 10:04:08 crc kubenswrapper[4891]: I0929 10:04:08.711284 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.228084 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-etc-swift\") pod \"swift-storage-0\" (UID: \"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e\") " pod="openstack/swift-storage-0" Sep 29 10:04:09 crc kubenswrapper[4891]: E0929 10:04:09.228403 4891 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 29 10:04:09 crc kubenswrapper[4891]: E0929 10:04:09.228897 4891 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 29 10:04:09 crc kubenswrapper[4891]: E0929 10:04:09.229002 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-etc-swift podName:66ec83ce-b4c6-412b-b7c4-6a61c6914c0e nodeName:}" failed. 
No retries permitted until 2025-09-29 10:04:13.228970762 +0000 UTC m=+983.434139133 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-etc-swift") pod "swift-storage-0" (UID: "66ec83ce-b4c6-412b-b7c4-6a61c6914c0e") : configmap "swift-ring-files" not found Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.384420 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-fnxwg"] Sep 29 10:04:09 crc kubenswrapper[4891]: E0929 10:04:09.385025 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a057026-5e22-4a88-b9ac-15ff57f5a9e2" containerName="init" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.385066 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a057026-5e22-4a88-b9ac-15ff57f5a9e2" containerName="init" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.385353 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a057026-5e22-4a88-b9ac-15ff57f5a9e2" containerName="init" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.386320 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.389576 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.391051 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.391297 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.394614 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-fnxwg"] Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.512560 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.513199 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.535362 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/becd282d-9d1a-4bf8-8e48-cdbab75047e1-dispersionconf\") pod \"swift-ring-rebalance-fnxwg\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.535416 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-555mq\" (UniqueName: \"kubernetes.io/projected/becd282d-9d1a-4bf8-8e48-cdbab75047e1-kube-api-access-555mq\") pod \"swift-ring-rebalance-fnxwg\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.535441 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/becd282d-9d1a-4bf8-8e48-cdbab75047e1-combined-ca-bundle\") pod \"swift-ring-rebalance-fnxwg\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.535482 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/becd282d-9d1a-4bf8-8e48-cdbab75047e1-swiftconf\") pod \"swift-ring-rebalance-fnxwg\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.535525 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/becd282d-9d1a-4bf8-8e48-cdbab75047e1-scripts\") pod \"swift-ring-rebalance-fnxwg\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.535542 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/becd282d-9d1a-4bf8-8e48-cdbab75047e1-etc-swift\") pod \"swift-ring-rebalance-fnxwg\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.535637 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/becd282d-9d1a-4bf8-8e48-cdbab75047e1-ring-data-devices\") pod \"swift-ring-rebalance-fnxwg\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.555530 4891 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.638020 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/becd282d-9d1a-4bf8-8e48-cdbab75047e1-dispersionconf\") pod \"swift-ring-rebalance-fnxwg\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.639005 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-555mq\" (UniqueName: \"kubernetes.io/projected/becd282d-9d1a-4bf8-8e48-cdbab75047e1-kube-api-access-555mq\") pod \"swift-ring-rebalance-fnxwg\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.639643 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/becd282d-9d1a-4bf8-8e48-cdbab75047e1-combined-ca-bundle\") pod \"swift-ring-rebalance-fnxwg\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.639836 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/becd282d-9d1a-4bf8-8e48-cdbab75047e1-swiftconf\") pod \"swift-ring-rebalance-fnxwg\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.639978 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/becd282d-9d1a-4bf8-8e48-cdbab75047e1-scripts\") pod \"swift-ring-rebalance-fnxwg\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 
10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.640000 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/becd282d-9d1a-4bf8-8e48-cdbab75047e1-etc-swift\") pod \"swift-ring-rebalance-fnxwg\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.640154 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/becd282d-9d1a-4bf8-8e48-cdbab75047e1-ring-data-devices\") pod \"swift-ring-rebalance-fnxwg\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.640758 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/becd282d-9d1a-4bf8-8e48-cdbab75047e1-etc-swift\") pod \"swift-ring-rebalance-fnxwg\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.641333 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/becd282d-9d1a-4bf8-8e48-cdbab75047e1-ring-data-devices\") pod \"swift-ring-rebalance-fnxwg\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.641361 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/becd282d-9d1a-4bf8-8e48-cdbab75047e1-scripts\") pod \"swift-ring-rebalance-fnxwg\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.643146 4891 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/becd282d-9d1a-4bf8-8e48-cdbab75047e1-combined-ca-bundle\") pod \"swift-ring-rebalance-fnxwg\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.644602 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/becd282d-9d1a-4bf8-8e48-cdbab75047e1-swiftconf\") pod \"swift-ring-rebalance-fnxwg\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.645334 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/becd282d-9d1a-4bf8-8e48-cdbab75047e1-dispersionconf\") pod \"swift-ring-rebalance-fnxwg\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.657873 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-555mq\" (UniqueName: \"kubernetes.io/projected/becd282d-9d1a-4bf8-8e48-cdbab75047e1-kube-api-access-555mq\") pod \"swift-ring-rebalance-fnxwg\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 10:04:09 crc kubenswrapper[4891]: I0929 10:04:09.710510 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.090434 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.090895 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.160411 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.201151 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-fnxwg"] Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.520488 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f895d522-5026-4d72-862e-1a2b1bd5ee3c","Type":"ContainerStarted","Data":"f35fd94d86fef28eca621ae9c5c625c1d28e5bd5c8a371383e8e27de6992ac12"} Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.521612 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-fnxwg" event={"ID":"becd282d-9d1a-4bf8-8e48-cdbab75047e1","Type":"ContainerStarted","Data":"c64747c6f0e341ad559a323c1441911eff1d4f5bdf949ba62a8eb6559998f011"} Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.568447 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.773656 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.775738 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.781126 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-h4v4t" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.781188 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.781278 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.781455 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.792011 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.879712 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/942ef260-597a-42db-9123-1e9e0b1c4e1b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"942ef260-597a-42db-9123-1e9e0b1c4e1b\") " pod="openstack/ovn-northd-0" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.879817 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djjfb\" (UniqueName: \"kubernetes.io/projected/942ef260-597a-42db-9123-1e9e0b1c4e1b-kube-api-access-djjfb\") pod \"ovn-northd-0\" (UID: \"942ef260-597a-42db-9123-1e9e0b1c4e1b\") " pod="openstack/ovn-northd-0" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.879866 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/942ef260-597a-42db-9123-1e9e0b1c4e1b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"942ef260-597a-42db-9123-1e9e0b1c4e1b\") " pod="openstack/ovn-northd-0" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.879918 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/942ef260-597a-42db-9123-1e9e0b1c4e1b-scripts\") pod \"ovn-northd-0\" (UID: \"942ef260-597a-42db-9123-1e9e0b1c4e1b\") " pod="openstack/ovn-northd-0" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.880043 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/942ef260-597a-42db-9123-1e9e0b1c4e1b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"942ef260-597a-42db-9123-1e9e0b1c4e1b\") " pod="openstack/ovn-northd-0" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.880102 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/942ef260-597a-42db-9123-1e9e0b1c4e1b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"942ef260-597a-42db-9123-1e9e0b1c4e1b\") " pod="openstack/ovn-northd-0" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.880246 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/942ef260-597a-42db-9123-1e9e0b1c4e1b-config\") pod \"ovn-northd-0\" (UID: \"942ef260-597a-42db-9123-1e9e0b1c4e1b\") " pod="openstack/ovn-northd-0" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.982658 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/942ef260-597a-42db-9123-1e9e0b1c4e1b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"942ef260-597a-42db-9123-1e9e0b1c4e1b\") " pod="openstack/ovn-northd-0" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.982720 4891 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djjfb\" (UniqueName: \"kubernetes.io/projected/942ef260-597a-42db-9123-1e9e0b1c4e1b-kube-api-access-djjfb\") pod \"ovn-northd-0\" (UID: \"942ef260-597a-42db-9123-1e9e0b1c4e1b\") " pod="openstack/ovn-northd-0" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.982755 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/942ef260-597a-42db-9123-1e9e0b1c4e1b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"942ef260-597a-42db-9123-1e9e0b1c4e1b\") " pod="openstack/ovn-northd-0" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.982776 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/942ef260-597a-42db-9123-1e9e0b1c4e1b-scripts\") pod \"ovn-northd-0\" (UID: \"942ef260-597a-42db-9123-1e9e0b1c4e1b\") " pod="openstack/ovn-northd-0" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.982938 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/942ef260-597a-42db-9123-1e9e0b1c4e1b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"942ef260-597a-42db-9123-1e9e0b1c4e1b\") " pod="openstack/ovn-northd-0" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.982966 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/942ef260-597a-42db-9123-1e9e0b1c4e1b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"942ef260-597a-42db-9123-1e9e0b1c4e1b\") " pod="openstack/ovn-northd-0" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.983008 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/942ef260-597a-42db-9123-1e9e0b1c4e1b-config\") pod \"ovn-northd-0\" 
(UID: \"942ef260-597a-42db-9123-1e9e0b1c4e1b\") " pod="openstack/ovn-northd-0" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.984059 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/942ef260-597a-42db-9123-1e9e0b1c4e1b-config\") pod \"ovn-northd-0\" (UID: \"942ef260-597a-42db-9123-1e9e0b1c4e1b\") " pod="openstack/ovn-northd-0" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.984645 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/942ef260-597a-42db-9123-1e9e0b1c4e1b-scripts\") pod \"ovn-northd-0\" (UID: \"942ef260-597a-42db-9123-1e9e0b1c4e1b\") " pod="openstack/ovn-northd-0" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.984450 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/942ef260-597a-42db-9123-1e9e0b1c4e1b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"942ef260-597a-42db-9123-1e9e0b1c4e1b\") " pod="openstack/ovn-northd-0" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.991667 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/942ef260-597a-42db-9123-1e9e0b1c4e1b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"942ef260-597a-42db-9123-1e9e0b1c4e1b\") " pod="openstack/ovn-northd-0" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.992782 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/942ef260-597a-42db-9123-1e9e0b1c4e1b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"942ef260-597a-42db-9123-1e9e0b1c4e1b\") " pod="openstack/ovn-northd-0" Sep 29 10:04:10 crc kubenswrapper[4891]: I0929 10:04:10.993183 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/942ef260-597a-42db-9123-1e9e0b1c4e1b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"942ef260-597a-42db-9123-1e9e0b1c4e1b\") " pod="openstack/ovn-northd-0" Sep 29 10:04:11 crc kubenswrapper[4891]: I0929 10:04:11.007704 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djjfb\" (UniqueName: \"kubernetes.io/projected/942ef260-597a-42db-9123-1e9e0b1c4e1b-kube-api-access-djjfb\") pod \"ovn-northd-0\" (UID: \"942ef260-597a-42db-9123-1e9e0b1c4e1b\") " pod="openstack/ovn-northd-0" Sep 29 10:04:11 crc kubenswrapper[4891]: I0929 10:04:11.107763 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Sep 29 10:04:11 crc kubenswrapper[4891]: I0929 10:04:11.563222 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 29 10:04:11 crc kubenswrapper[4891]: I0929 10:04:11.605934 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Sep 29 10:04:11 crc kubenswrapper[4891]: I0929 10:04:11.606343 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Sep 29 10:04:11 crc kubenswrapper[4891]: I0929 10:04:11.664617 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Sep 29 10:04:12 crc kubenswrapper[4891]: I0929 10:04:12.059134 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Sep 29 10:04:12 crc kubenswrapper[4891]: I0929 10:04:12.059189 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Sep 29 10:04:12 crc kubenswrapper[4891]: I0929 10:04:12.128422 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Sep 29 10:04:12 crc kubenswrapper[4891]: I0929 10:04:12.539008 4891 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"942ef260-597a-42db-9123-1e9e0b1c4e1b","Type":"ContainerStarted","Data":"bf453bacfe5287e740432dba02cae5818640fc2ae3de50d2d700d3fe85607fa4"} Sep 29 10:04:12 crc kubenswrapper[4891]: I0929 10:04:12.596120 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Sep 29 10:04:12 crc kubenswrapper[4891]: I0929 10:04:12.601715 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Sep 29 10:04:12 crc kubenswrapper[4891]: I0929 10:04:12.874027 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-tfj4j"] Sep 29 10:04:12 crc kubenswrapper[4891]: I0929 10:04:12.877678 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-tfj4j" Sep 29 10:04:12 crc kubenswrapper[4891]: I0929 10:04:12.891507 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-tfj4j"] Sep 29 10:04:12 crc kubenswrapper[4891]: I0929 10:04:12.992956 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p88m\" (UniqueName: \"kubernetes.io/projected/e8429397-c754-44a8-bda3-9162297c7093-kube-api-access-6p88m\") pod \"glance-db-create-tfj4j\" (UID: \"e8429397-c754-44a8-bda3-9162297c7093\") " pod="openstack/glance-db-create-tfj4j" Sep 29 10:04:13 crc kubenswrapper[4891]: I0929 10:04:13.095184 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p88m\" (UniqueName: \"kubernetes.io/projected/e8429397-c754-44a8-bda3-9162297c7093-kube-api-access-6p88m\") pod \"glance-db-create-tfj4j\" (UID: \"e8429397-c754-44a8-bda3-9162297c7093\") " pod="openstack/glance-db-create-tfj4j" Sep 29 10:04:13 crc kubenswrapper[4891]: I0929 10:04:13.122576 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6p88m\" (UniqueName: \"kubernetes.io/projected/e8429397-c754-44a8-bda3-9162297c7093-kube-api-access-6p88m\") pod \"glance-db-create-tfj4j\" (UID: \"e8429397-c754-44a8-bda3-9162297c7093\") " pod="openstack/glance-db-create-tfj4j" Sep 29 10:04:13 crc kubenswrapper[4891]: I0929 10:04:13.216910 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-tfj4j" Sep 29 10:04:13 crc kubenswrapper[4891]: I0929 10:04:13.302278 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-etc-swift\") pod \"swift-storage-0\" (UID: \"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e\") " pod="openstack/swift-storage-0" Sep 29 10:04:13 crc kubenswrapper[4891]: E0929 10:04:13.302568 4891 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 29 10:04:13 crc kubenswrapper[4891]: E0929 10:04:13.302823 4891 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 29 10:04:13 crc kubenswrapper[4891]: E0929 10:04:13.302896 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-etc-swift podName:66ec83ce-b4c6-412b-b7c4-6a61c6914c0e nodeName:}" failed. No retries permitted until 2025-09-29 10:04:21.30287541 +0000 UTC m=+991.508043731 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-etc-swift") pod "swift-storage-0" (UID: "66ec83ce-b4c6-412b-b7c4-6a61c6914c0e") : configmap "swift-ring-files" not found Sep 29 10:04:14 crc kubenswrapper[4891]: I0929 10:04:14.277105 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-tfj4j"] Sep 29 10:04:14 crc kubenswrapper[4891]: I0929 10:04:14.563968 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-tfj4j" event={"ID":"e8429397-c754-44a8-bda3-9162297c7093","Type":"ContainerStarted","Data":"da139a3f823329cc73190716d8b8db4bad67f74d5b1df863673417a5a8cfcf80"} Sep 29 10:04:14 crc kubenswrapper[4891]: I0929 10:04:14.566877 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-fnxwg" event={"ID":"becd282d-9d1a-4bf8-8e48-cdbab75047e1","Type":"ContainerStarted","Data":"8280c6f38412aa695fdcd51ad92966bba44046feb0bb33010886aab09a592264"} Sep 29 10:04:14 crc kubenswrapper[4891]: I0929 10:04:14.650174 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" Sep 29 10:04:14 crc kubenswrapper[4891]: I0929 10:04:14.706328 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-fnxwg" podStartSLOduration=2.036689949 podStartE2EDuration="5.706298595s" podCreationTimestamp="2025-09-29 10:04:09 +0000 UTC" firstStartedPulling="2025-09-29 10:04:10.216288926 +0000 UTC m=+980.421457287" lastFinishedPulling="2025-09-29 10:04:13.885897612 +0000 UTC m=+984.091065933" observedRunningTime="2025-09-29 10:04:14.600076386 +0000 UTC m=+984.805244717" watchObservedRunningTime="2025-09-29 10:04:14.706298595 +0000 UTC m=+984.911466916" Sep 29 10:04:14 crc kubenswrapper[4891]: I0929 10:04:14.721651 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-q75rz"] Sep 
29 10:04:14 crc kubenswrapper[4891]: I0929 10:04:14.721979 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-q75rz" podUID="a1b7933b-8030-4832-87af-ee18342581b8" containerName="dnsmasq-dns" containerID="cri-o://00fa1ab15490c2768b4465c68439c0b780e0117eaaeaeda27ccaf56988a80553" gracePeriod=10 Sep 29 10:04:14 crc kubenswrapper[4891]: I0929 10:04:14.722983 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-q75rz" Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.219438 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-q75rz" Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.350708 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1b7933b-8030-4832-87af-ee18342581b8-ovsdbserver-nb\") pod \"a1b7933b-8030-4832-87af-ee18342581b8\" (UID: \"a1b7933b-8030-4832-87af-ee18342581b8\") " Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.350828 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1b7933b-8030-4832-87af-ee18342581b8-config\") pod \"a1b7933b-8030-4832-87af-ee18342581b8\" (UID: \"a1b7933b-8030-4832-87af-ee18342581b8\") " Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.350857 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1b7933b-8030-4832-87af-ee18342581b8-dns-svc\") pod \"a1b7933b-8030-4832-87af-ee18342581b8\" (UID: \"a1b7933b-8030-4832-87af-ee18342581b8\") " Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.350947 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k4zt\" (UniqueName: 
\"kubernetes.io/projected/a1b7933b-8030-4832-87af-ee18342581b8-kube-api-access-7k4zt\") pod \"a1b7933b-8030-4832-87af-ee18342581b8\" (UID: \"a1b7933b-8030-4832-87af-ee18342581b8\") " Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.350987 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1b7933b-8030-4832-87af-ee18342581b8-ovsdbserver-sb\") pod \"a1b7933b-8030-4832-87af-ee18342581b8\" (UID: \"a1b7933b-8030-4832-87af-ee18342581b8\") " Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.357809 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1b7933b-8030-4832-87af-ee18342581b8-kube-api-access-7k4zt" (OuterVolumeSpecName: "kube-api-access-7k4zt") pod "a1b7933b-8030-4832-87af-ee18342581b8" (UID: "a1b7933b-8030-4832-87af-ee18342581b8"). InnerVolumeSpecName "kube-api-access-7k4zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.394961 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1b7933b-8030-4832-87af-ee18342581b8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a1b7933b-8030-4832-87af-ee18342581b8" (UID: "a1b7933b-8030-4832-87af-ee18342581b8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.395135 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1b7933b-8030-4832-87af-ee18342581b8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a1b7933b-8030-4832-87af-ee18342581b8" (UID: "a1b7933b-8030-4832-87af-ee18342581b8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.395743 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1b7933b-8030-4832-87af-ee18342581b8-config" (OuterVolumeSpecName: "config") pod "a1b7933b-8030-4832-87af-ee18342581b8" (UID: "a1b7933b-8030-4832-87af-ee18342581b8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.404422 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1b7933b-8030-4832-87af-ee18342581b8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a1b7933b-8030-4832-87af-ee18342581b8" (UID: "a1b7933b-8030-4832-87af-ee18342581b8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.454056 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1b7933b-8030-4832-87af-ee18342581b8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.454098 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1b7933b-8030-4832-87af-ee18342581b8-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.454107 4891 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1b7933b-8030-4832-87af-ee18342581b8-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.454119 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k4zt\" (UniqueName: \"kubernetes.io/projected/a1b7933b-8030-4832-87af-ee18342581b8-kube-api-access-7k4zt\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:15 crc 
kubenswrapper[4891]: I0929 10:04:15.454129 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1b7933b-8030-4832-87af-ee18342581b8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.583184 4891 generic.go:334] "Generic (PLEG): container finished" podID="e8429397-c754-44a8-bda3-9162297c7093" containerID="51ee6c62107e38e82f6f0d3462fa457544eb4d0e29f27540795bc5736c8208f8" exitCode=0 Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.583265 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-tfj4j" event={"ID":"e8429397-c754-44a8-bda3-9162297c7093","Type":"ContainerDied","Data":"51ee6c62107e38e82f6f0d3462fa457544eb4d0e29f27540795bc5736c8208f8"} Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.597698 4891 generic.go:334] "Generic (PLEG): container finished" podID="a1b7933b-8030-4832-87af-ee18342581b8" containerID="00fa1ab15490c2768b4465c68439c0b780e0117eaaeaeda27ccaf56988a80553" exitCode=0 Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.597897 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-q75rz" Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.599077 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-q75rz" event={"ID":"a1b7933b-8030-4832-87af-ee18342581b8","Type":"ContainerDied","Data":"00fa1ab15490c2768b4465c68439c0b780e0117eaaeaeda27ccaf56988a80553"} Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.599148 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-q75rz" event={"ID":"a1b7933b-8030-4832-87af-ee18342581b8","Type":"ContainerDied","Data":"963acb496f4fb2c6b5ffc3b004cb6181541eb20550eb34b0e06734853c13832a"} Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.599170 4891 scope.go:117] "RemoveContainer" containerID="00fa1ab15490c2768b4465c68439c0b780e0117eaaeaeda27ccaf56988a80553" Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.604950 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"942ef260-597a-42db-9123-1e9e0b1c4e1b","Type":"ContainerStarted","Data":"b1d0159b79e25b62de3f1662240ac8a3e45998b14366abb78a5cc6f4dc5085f4"} Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.605004 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.605018 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"942ef260-597a-42db-9123-1e9e0b1c4e1b","Type":"ContainerStarted","Data":"7f28ab42b47751544e0d41ed9e263447880c882ac07b6a79077885eedcf959c4"} Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.630012 4891 scope.go:117] "RemoveContainer" containerID="c0d04d2efd165c06a40b700d7a587bd5ae6fcd619d81350edad6dae13c0cbda9" Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.676743 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" 
podStartSLOduration=2.733392028 podStartE2EDuration="5.676716304s" podCreationTimestamp="2025-09-29 10:04:10 +0000 UTC" firstStartedPulling="2025-09-29 10:04:11.577535763 +0000 UTC m=+981.782704084" lastFinishedPulling="2025-09-29 10:04:14.520860039 +0000 UTC m=+984.726028360" observedRunningTime="2025-09-29 10:04:15.656247103 +0000 UTC m=+985.861415424" watchObservedRunningTime="2025-09-29 10:04:15.676716304 +0000 UTC m=+985.881884615" Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.683132 4891 scope.go:117] "RemoveContainer" containerID="00fa1ab15490c2768b4465c68439c0b780e0117eaaeaeda27ccaf56988a80553" Sep 29 10:04:15 crc kubenswrapper[4891]: E0929 10:04:15.686919 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00fa1ab15490c2768b4465c68439c0b780e0117eaaeaeda27ccaf56988a80553\": container with ID starting with 00fa1ab15490c2768b4465c68439c0b780e0117eaaeaeda27ccaf56988a80553 not found: ID does not exist" containerID="00fa1ab15490c2768b4465c68439c0b780e0117eaaeaeda27ccaf56988a80553" Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.686969 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00fa1ab15490c2768b4465c68439c0b780e0117eaaeaeda27ccaf56988a80553"} err="failed to get container status \"00fa1ab15490c2768b4465c68439c0b780e0117eaaeaeda27ccaf56988a80553\": rpc error: code = NotFound desc = could not find container \"00fa1ab15490c2768b4465c68439c0b780e0117eaaeaeda27ccaf56988a80553\": container with ID starting with 00fa1ab15490c2768b4465c68439c0b780e0117eaaeaeda27ccaf56988a80553 not found: ID does not exist" Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.687000 4891 scope.go:117] "RemoveContainer" containerID="c0d04d2efd165c06a40b700d7a587bd5ae6fcd619d81350edad6dae13c0cbda9" Sep 29 10:04:15 crc kubenswrapper[4891]: E0929 10:04:15.690957 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"c0d04d2efd165c06a40b700d7a587bd5ae6fcd619d81350edad6dae13c0cbda9\": container with ID starting with c0d04d2efd165c06a40b700d7a587bd5ae6fcd619d81350edad6dae13c0cbda9 not found: ID does not exist" containerID="c0d04d2efd165c06a40b700d7a587bd5ae6fcd619d81350edad6dae13c0cbda9" Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.691011 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0d04d2efd165c06a40b700d7a587bd5ae6fcd619d81350edad6dae13c0cbda9"} err="failed to get container status \"c0d04d2efd165c06a40b700d7a587bd5ae6fcd619d81350edad6dae13c0cbda9\": rpc error: code = NotFound desc = could not find container \"c0d04d2efd165c06a40b700d7a587bd5ae6fcd619d81350edad6dae13c0cbda9\": container with ID starting with c0d04d2efd165c06a40b700d7a587bd5ae6fcd619d81350edad6dae13c0cbda9 not found: ID does not exist" Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.702853 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-q75rz"] Sep 29 10:04:15 crc kubenswrapper[4891]: I0929 10:04:15.720411 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-q75rz"] Sep 29 10:04:16 crc kubenswrapper[4891]: I0929 10:04:16.414007 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1b7933b-8030-4832-87af-ee18342581b8" path="/var/lib/kubelet/pods/a1b7933b-8030-4832-87af-ee18342581b8/volumes" Sep 29 10:04:16 crc kubenswrapper[4891]: I0929 10:04:16.950113 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-tfj4j" Sep 29 10:04:17 crc kubenswrapper[4891]: I0929 10:04:17.087904 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p88m\" (UniqueName: \"kubernetes.io/projected/e8429397-c754-44a8-bda3-9162297c7093-kube-api-access-6p88m\") pod \"e8429397-c754-44a8-bda3-9162297c7093\" (UID: \"e8429397-c754-44a8-bda3-9162297c7093\") " Sep 29 10:04:17 crc kubenswrapper[4891]: I0929 10:04:17.095024 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8429397-c754-44a8-bda3-9162297c7093-kube-api-access-6p88m" (OuterVolumeSpecName: "kube-api-access-6p88m") pod "e8429397-c754-44a8-bda3-9162297c7093" (UID: "e8429397-c754-44a8-bda3-9162297c7093"). InnerVolumeSpecName "kube-api-access-6p88m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:04:17 crc kubenswrapper[4891]: I0929 10:04:17.190942 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p88m\" (UniqueName: \"kubernetes.io/projected/e8429397-c754-44a8-bda3-9162297c7093-kube-api-access-6p88m\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:17 crc kubenswrapper[4891]: I0929 10:04:17.628131 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-tfj4j" event={"ID":"e8429397-c754-44a8-bda3-9162297c7093","Type":"ContainerDied","Data":"da139a3f823329cc73190716d8b8db4bad67f74d5b1df863673417a5a8cfcf80"} Sep 29 10:04:17 crc kubenswrapper[4891]: I0929 10:04:17.628189 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da139a3f823329cc73190716d8b8db4bad67f74d5b1df863673417a5a8cfcf80" Sep 29 10:04:17 crc kubenswrapper[4891]: I0929 10:04:17.628204 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-tfj4j" Sep 29 10:04:21 crc kubenswrapper[4891]: I0929 10:04:21.396997 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-etc-swift\") pod \"swift-storage-0\" (UID: \"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e\") " pod="openstack/swift-storage-0" Sep 29 10:04:21 crc kubenswrapper[4891]: E0929 10:04:21.397191 4891 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 29 10:04:21 crc kubenswrapper[4891]: E0929 10:04:21.397590 4891 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 29 10:04:21 crc kubenswrapper[4891]: E0929 10:04:21.397704 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-etc-swift podName:66ec83ce-b4c6-412b-b7c4-6a61c6914c0e nodeName:}" failed. No retries permitted until 2025-09-29 10:04:37.397637251 +0000 UTC m=+1007.602805582 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-etc-swift") pod "swift-storage-0" (UID: "66ec83ce-b4c6-412b-b7c4-6a61c6914c0e") : configmap "swift-ring-files" not found Sep 29 10:04:22 crc kubenswrapper[4891]: I0929 10:04:22.100839 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-r9sf8"] Sep 29 10:04:22 crc kubenswrapper[4891]: E0929 10:04:22.101741 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b7933b-8030-4832-87af-ee18342581b8" containerName="dnsmasq-dns" Sep 29 10:04:22 crc kubenswrapper[4891]: I0929 10:04:22.101768 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b7933b-8030-4832-87af-ee18342581b8" containerName="dnsmasq-dns" Sep 29 10:04:22 crc kubenswrapper[4891]: E0929 10:04:22.101829 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8429397-c754-44a8-bda3-9162297c7093" containerName="mariadb-database-create" Sep 29 10:04:22 crc kubenswrapper[4891]: I0929 10:04:22.101839 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8429397-c754-44a8-bda3-9162297c7093" containerName="mariadb-database-create" Sep 29 10:04:22 crc kubenswrapper[4891]: E0929 10:04:22.101875 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b7933b-8030-4832-87af-ee18342581b8" containerName="init" Sep 29 10:04:22 crc kubenswrapper[4891]: I0929 10:04:22.101885 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b7933b-8030-4832-87af-ee18342581b8" containerName="init" Sep 29 10:04:22 crc kubenswrapper[4891]: I0929 10:04:22.102100 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1b7933b-8030-4832-87af-ee18342581b8" containerName="dnsmasq-dns" Sep 29 10:04:22 crc kubenswrapper[4891]: I0929 10:04:22.102120 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8429397-c754-44a8-bda3-9162297c7093" containerName="mariadb-database-create" Sep 
29 10:04:22 crc kubenswrapper[4891]: I0929 10:04:22.102956 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-r9sf8" Sep 29 10:04:22 crc kubenswrapper[4891]: I0929 10:04:22.117177 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-r9sf8"] Sep 29 10:04:22 crc kubenswrapper[4891]: I0929 10:04:22.213684 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccm62\" (UniqueName: \"kubernetes.io/projected/2ec95c81-cfd3-46a1-befc-581ea6a57bc3-kube-api-access-ccm62\") pod \"keystone-db-create-r9sf8\" (UID: \"2ec95c81-cfd3-46a1-befc-581ea6a57bc3\") " pod="openstack/keystone-db-create-r9sf8" Sep 29 10:04:22 crc kubenswrapper[4891]: I0929 10:04:22.315289 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccm62\" (UniqueName: \"kubernetes.io/projected/2ec95c81-cfd3-46a1-befc-581ea6a57bc3-kube-api-access-ccm62\") pod \"keystone-db-create-r9sf8\" (UID: \"2ec95c81-cfd3-46a1-befc-581ea6a57bc3\") " pod="openstack/keystone-db-create-r9sf8" Sep 29 10:04:22 crc kubenswrapper[4891]: I0929 10:04:22.337029 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccm62\" (UniqueName: \"kubernetes.io/projected/2ec95c81-cfd3-46a1-befc-581ea6a57bc3-kube-api-access-ccm62\") pod \"keystone-db-create-r9sf8\" (UID: \"2ec95c81-cfd3-46a1-befc-581ea6a57bc3\") " pod="openstack/keystone-db-create-r9sf8" Sep 29 10:04:22 crc kubenswrapper[4891]: I0929 10:04:22.407643 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-c4dvn"] Sep 29 10:04:22 crc kubenswrapper[4891]: I0929 10:04:22.412018 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-c4dvn" Sep 29 10:04:22 crc kubenswrapper[4891]: I0929 10:04:22.450103 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-c4dvn"] Sep 29 10:04:22 crc kubenswrapper[4891]: I0929 10:04:22.488841 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-r9sf8" Sep 29 10:04:22 crc kubenswrapper[4891]: I0929 10:04:22.528660 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72tq5\" (UniqueName: \"kubernetes.io/projected/9a561e40-cae5-4c35-9f8b-9424f4aa61a5-kube-api-access-72tq5\") pod \"placement-db-create-c4dvn\" (UID: \"9a561e40-cae5-4c35-9f8b-9424f4aa61a5\") " pod="openstack/placement-db-create-c4dvn" Sep 29 10:04:22 crc kubenswrapper[4891]: I0929 10:04:22.631714 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72tq5\" (UniqueName: \"kubernetes.io/projected/9a561e40-cae5-4c35-9f8b-9424f4aa61a5-kube-api-access-72tq5\") pod \"placement-db-create-c4dvn\" (UID: \"9a561e40-cae5-4c35-9f8b-9424f4aa61a5\") " pod="openstack/placement-db-create-c4dvn" Sep 29 10:04:22 crc kubenswrapper[4891]: I0929 10:04:22.687687 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72tq5\" (UniqueName: \"kubernetes.io/projected/9a561e40-cae5-4c35-9f8b-9424f4aa61a5-kube-api-access-72tq5\") pod \"placement-db-create-c4dvn\" (UID: \"9a561e40-cae5-4c35-9f8b-9424f4aa61a5\") " pod="openstack/placement-db-create-c4dvn" Sep 29 10:04:22 crc kubenswrapper[4891]: I0929 10:04:22.699488 4891 generic.go:334] "Generic (PLEG): container finished" podID="becd282d-9d1a-4bf8-8e48-cdbab75047e1" containerID="8280c6f38412aa695fdcd51ad92966bba44046feb0bb33010886aab09a592264" exitCode=0 Sep 29 10:04:22 crc kubenswrapper[4891]: I0929 10:04:22.699575 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-ring-rebalance-fnxwg" event={"ID":"becd282d-9d1a-4bf8-8e48-cdbab75047e1","Type":"ContainerDied","Data":"8280c6f38412aa695fdcd51ad92966bba44046feb0bb33010886aab09a592264"} Sep 29 10:04:22 crc kubenswrapper[4891]: I0929 10:04:22.749163 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-c4dvn" Sep 29 10:04:22 crc kubenswrapper[4891]: I0929 10:04:22.948828 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-r9sf8"] Sep 29 10:04:22 crc kubenswrapper[4891]: W0929 10:04:22.956102 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ec95c81_cfd3_46a1_befc_581ea6a57bc3.slice/crio-0f00fbed34ebdcf456151338f727273a1761583e8480ae7898746c6ee542d2a3 WatchSource:0}: Error finding container 0f00fbed34ebdcf456151338f727273a1761583e8480ae7898746c6ee542d2a3: Status 404 returned error can't find the container with id 0f00fbed34ebdcf456151338f727273a1761583e8480ae7898746c6ee542d2a3 Sep 29 10:04:23 crc kubenswrapper[4891]: W0929 10:04:23.209726 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a561e40_cae5_4c35_9f8b_9424f4aa61a5.slice/crio-2bd8f76a4e5c9d9f8b89b5b7c1123fd3a663ef7bac5b3323135738ff303cb9e3 WatchSource:0}: Error finding container 2bd8f76a4e5c9d9f8b89b5b7c1123fd3a663ef7bac5b3323135738ff303cb9e3: Status 404 returned error can't find the container with id 2bd8f76a4e5c9d9f8b89b5b7c1123fd3a663ef7bac5b3323135738ff303cb9e3 Sep 29 10:04:23 crc kubenswrapper[4891]: I0929 10:04:23.210036 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-c4dvn"] Sep 29 10:04:23 crc kubenswrapper[4891]: I0929 10:04:23.711886 4891 generic.go:334] "Generic (PLEG): container finished" podID="9a561e40-cae5-4c35-9f8b-9424f4aa61a5" 
containerID="500e7d112bcc7de537acd25ecbcaeb58ef79020b66a6f7e14c54bddd259219a3" exitCode=0 Sep 29 10:04:23 crc kubenswrapper[4891]: I0929 10:04:23.711978 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-c4dvn" event={"ID":"9a561e40-cae5-4c35-9f8b-9424f4aa61a5","Type":"ContainerDied","Data":"500e7d112bcc7de537acd25ecbcaeb58ef79020b66a6f7e14c54bddd259219a3"} Sep 29 10:04:23 crc kubenswrapper[4891]: I0929 10:04:23.712016 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-c4dvn" event={"ID":"9a561e40-cae5-4c35-9f8b-9424f4aa61a5","Type":"ContainerStarted","Data":"2bd8f76a4e5c9d9f8b89b5b7c1123fd3a663ef7bac5b3323135738ff303cb9e3"} Sep 29 10:04:23 crc kubenswrapper[4891]: I0929 10:04:23.714641 4891 generic.go:334] "Generic (PLEG): container finished" podID="2ec95c81-cfd3-46a1-befc-581ea6a57bc3" containerID="4442c7ee92e3b29486eece5f6bf2f6429ca7849604a4f6e5195602ae26b0fd56" exitCode=0 Sep 29 10:04:23 crc kubenswrapper[4891]: I0929 10:04:23.714997 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-r9sf8" event={"ID":"2ec95c81-cfd3-46a1-befc-581ea6a57bc3","Type":"ContainerDied","Data":"4442c7ee92e3b29486eece5f6bf2f6429ca7849604a4f6e5195602ae26b0fd56"} Sep 29 10:04:23 crc kubenswrapper[4891]: I0929 10:04:23.715103 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-r9sf8" event={"ID":"2ec95c81-cfd3-46a1-befc-581ea6a57bc3","Type":"ContainerStarted","Data":"0f00fbed34ebdcf456151338f727273a1761583e8480ae7898746c6ee542d2a3"} Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.068084 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.072227 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/becd282d-9d1a-4bf8-8e48-cdbab75047e1-etc-swift\") pod \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.072352 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/becd282d-9d1a-4bf8-8e48-cdbab75047e1-dispersionconf\") pod \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.072457 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/becd282d-9d1a-4bf8-8e48-cdbab75047e1-ring-data-devices\") pod \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.072570 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/becd282d-9d1a-4bf8-8e48-cdbab75047e1-combined-ca-bundle\") pod \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.072714 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-555mq\" (UniqueName: \"kubernetes.io/projected/becd282d-9d1a-4bf8-8e48-cdbab75047e1-kube-api-access-555mq\") pod \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.072859 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/becd282d-9d1a-4bf8-8e48-cdbab75047e1-scripts\") pod \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.072892 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/becd282d-9d1a-4bf8-8e48-cdbab75047e1-swiftconf\") pod \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\" (UID: \"becd282d-9d1a-4bf8-8e48-cdbab75047e1\") " Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.074648 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/becd282d-9d1a-4bf8-8e48-cdbab75047e1-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "becd282d-9d1a-4bf8-8e48-cdbab75047e1" (UID: "becd282d-9d1a-4bf8-8e48-cdbab75047e1"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.075749 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/becd282d-9d1a-4bf8-8e48-cdbab75047e1-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "becd282d-9d1a-4bf8-8e48-cdbab75047e1" (UID: "becd282d-9d1a-4bf8-8e48-cdbab75047e1"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.080907 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/becd282d-9d1a-4bf8-8e48-cdbab75047e1-kube-api-access-555mq" (OuterVolumeSpecName: "kube-api-access-555mq") pod "becd282d-9d1a-4bf8-8e48-cdbab75047e1" (UID: "becd282d-9d1a-4bf8-8e48-cdbab75047e1"). InnerVolumeSpecName "kube-api-access-555mq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.086719 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/becd282d-9d1a-4bf8-8e48-cdbab75047e1-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "becd282d-9d1a-4bf8-8e48-cdbab75047e1" (UID: "becd282d-9d1a-4bf8-8e48-cdbab75047e1"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.105136 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/becd282d-9d1a-4bf8-8e48-cdbab75047e1-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "becd282d-9d1a-4bf8-8e48-cdbab75047e1" (UID: "becd282d-9d1a-4bf8-8e48-cdbab75047e1"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.105623 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/becd282d-9d1a-4bf8-8e48-cdbab75047e1-scripts" (OuterVolumeSpecName: "scripts") pod "becd282d-9d1a-4bf8-8e48-cdbab75047e1" (UID: "becd282d-9d1a-4bf8-8e48-cdbab75047e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.124917 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/becd282d-9d1a-4bf8-8e48-cdbab75047e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "becd282d-9d1a-4bf8-8e48-cdbab75047e1" (UID: "becd282d-9d1a-4bf8-8e48-cdbab75047e1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.175385 4891 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/becd282d-9d1a-4bf8-8e48-cdbab75047e1-ring-data-devices\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.175740 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/becd282d-9d1a-4bf8-8e48-cdbab75047e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.175858 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-555mq\" (UniqueName: \"kubernetes.io/projected/becd282d-9d1a-4bf8-8e48-cdbab75047e1-kube-api-access-555mq\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.175977 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/becd282d-9d1a-4bf8-8e48-cdbab75047e1-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.176054 4891 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/becd282d-9d1a-4bf8-8e48-cdbab75047e1-swiftconf\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.176123 4891 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/becd282d-9d1a-4bf8-8e48-cdbab75047e1-etc-swift\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.176250 4891 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/becd282d-9d1a-4bf8-8e48-cdbab75047e1-dispersionconf\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.727729 4891 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-fnxwg" event={"ID":"becd282d-9d1a-4bf8-8e48-cdbab75047e1","Type":"ContainerDied","Data":"c64747c6f0e341ad559a323c1441911eff1d4f5bdf949ba62a8eb6559998f011"} Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.727889 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c64747c6f0e341ad559a323c1441911eff1d4f5bdf949ba62a8eb6559998f011" Sep 29 10:04:24 crc kubenswrapper[4891]: I0929 10:04:24.730012 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-fnxwg" Sep 29 10:04:25 crc kubenswrapper[4891]: I0929 10:04:25.234295 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-r9sf8" Sep 29 10:04:25 crc kubenswrapper[4891]: I0929 10:04:25.251750 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-c4dvn" Sep 29 10:04:25 crc kubenswrapper[4891]: I0929 10:04:25.298887 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccm62\" (UniqueName: \"kubernetes.io/projected/2ec95c81-cfd3-46a1-befc-581ea6a57bc3-kube-api-access-ccm62\") pod \"2ec95c81-cfd3-46a1-befc-581ea6a57bc3\" (UID: \"2ec95c81-cfd3-46a1-befc-581ea6a57bc3\") " Sep 29 10:04:25 crc kubenswrapper[4891]: I0929 10:04:25.299382 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72tq5\" (UniqueName: \"kubernetes.io/projected/9a561e40-cae5-4c35-9f8b-9424f4aa61a5-kube-api-access-72tq5\") pod \"9a561e40-cae5-4c35-9f8b-9424f4aa61a5\" (UID: \"9a561e40-cae5-4c35-9f8b-9424f4aa61a5\") " Sep 29 10:04:25 crc kubenswrapper[4891]: I0929 10:04:25.304155 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a561e40-cae5-4c35-9f8b-9424f4aa61a5-kube-api-access-72tq5" (OuterVolumeSpecName: 
"kube-api-access-72tq5") pod "9a561e40-cae5-4c35-9f8b-9424f4aa61a5" (UID: "9a561e40-cae5-4c35-9f8b-9424f4aa61a5"). InnerVolumeSpecName "kube-api-access-72tq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:04:25 crc kubenswrapper[4891]: I0929 10:04:25.307298 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ec95c81-cfd3-46a1-befc-581ea6a57bc3-kube-api-access-ccm62" (OuterVolumeSpecName: "kube-api-access-ccm62") pod "2ec95c81-cfd3-46a1-befc-581ea6a57bc3" (UID: "2ec95c81-cfd3-46a1-befc-581ea6a57bc3"). InnerVolumeSpecName "kube-api-access-ccm62". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:04:25 crc kubenswrapper[4891]: I0929 10:04:25.401683 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccm62\" (UniqueName: \"kubernetes.io/projected/2ec95c81-cfd3-46a1-befc-581ea6a57bc3-kube-api-access-ccm62\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:25 crc kubenswrapper[4891]: I0929 10:04:25.401730 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72tq5\" (UniqueName: \"kubernetes.io/projected/9a561e40-cae5-4c35-9f8b-9424f4aa61a5-kube-api-access-72tq5\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:25 crc kubenswrapper[4891]: I0929 10:04:25.742309 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-c4dvn" Sep 29 10:04:25 crc kubenswrapper[4891]: I0929 10:04:25.742330 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-c4dvn" event={"ID":"9a561e40-cae5-4c35-9f8b-9424f4aa61a5","Type":"ContainerDied","Data":"2bd8f76a4e5c9d9f8b89b5b7c1123fd3a663ef7bac5b3323135738ff303cb9e3"} Sep 29 10:04:25 crc kubenswrapper[4891]: I0929 10:04:25.743247 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bd8f76a4e5c9d9f8b89b5b7c1123fd3a663ef7bac5b3323135738ff303cb9e3" Sep 29 10:04:25 crc kubenswrapper[4891]: I0929 10:04:25.745253 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-r9sf8" event={"ID":"2ec95c81-cfd3-46a1-befc-581ea6a57bc3","Type":"ContainerDied","Data":"0f00fbed34ebdcf456151338f727273a1761583e8480ae7898746c6ee542d2a3"} Sep 29 10:04:25 crc kubenswrapper[4891]: I0929 10:04:25.745385 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f00fbed34ebdcf456151338f727273a1761583e8480ae7898746c6ee542d2a3" Sep 29 10:04:25 crc kubenswrapper[4891]: I0929 10:04:25.745524 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-r9sf8" Sep 29 10:04:26 crc kubenswrapper[4891]: I0929 10:04:26.183003 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.254446 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6d62-account-create-dnmtw"] Sep 29 10:04:32 crc kubenswrapper[4891]: E0929 10:04:32.257170 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a561e40-cae5-4c35-9f8b-9424f4aa61a5" containerName="mariadb-database-create" Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.257381 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a561e40-cae5-4c35-9f8b-9424f4aa61a5" containerName="mariadb-database-create" Sep 29 10:04:32 crc kubenswrapper[4891]: E0929 10:04:32.257422 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ec95c81-cfd3-46a1-befc-581ea6a57bc3" containerName="mariadb-database-create" Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.257435 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ec95c81-cfd3-46a1-befc-581ea6a57bc3" containerName="mariadb-database-create" Sep 29 10:04:32 crc kubenswrapper[4891]: E0929 10:04:32.257459 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="becd282d-9d1a-4bf8-8e48-cdbab75047e1" containerName="swift-ring-rebalance" Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.257471 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="becd282d-9d1a-4bf8-8e48-cdbab75047e1" containerName="swift-ring-rebalance" Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.257757 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a561e40-cae5-4c35-9f8b-9424f4aa61a5" containerName="mariadb-database-create" Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.257809 4891 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2ec95c81-cfd3-46a1-befc-581ea6a57bc3" containerName="mariadb-database-create" Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.257845 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="becd282d-9d1a-4bf8-8e48-cdbab75047e1" containerName="swift-ring-rebalance" Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.258830 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6d62-account-create-dnmtw" Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.264049 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6d62-account-create-dnmtw"] Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.264272 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.290899 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-jq4xk" podUID="7484acb7-f4b2-417b-a478-86b8c5999c34" containerName="ovn-controller" probeResult="failure" output=< Sep 29 10:04:32 crc kubenswrapper[4891]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Sep 29 10:04:32 crc kubenswrapper[4891]: > Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.358444 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz29j\" (UniqueName: \"kubernetes.io/projected/30cc69b2-dc90-486e-8d87-3ff906ee4288-kube-api-access-jz29j\") pod \"keystone-6d62-account-create-dnmtw\" (UID: \"30cc69b2-dc90-486e-8d87-3ff906ee4288\") " pod="openstack/keystone-6d62-account-create-dnmtw" Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.425211 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-83c1-account-create-td9vd"] Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.426885 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-83c1-account-create-td9vd" Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.429906 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.439078 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-83c1-account-create-td9vd"] Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.461266 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k59sp\" (UniqueName: \"kubernetes.io/projected/9ac4d0d2-6597-4879-818c-7d0748094e3e-kube-api-access-k59sp\") pod \"placement-83c1-account-create-td9vd\" (UID: \"9ac4d0d2-6597-4879-818c-7d0748094e3e\") " pod="openstack/placement-83c1-account-create-td9vd" Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.461386 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz29j\" (UniqueName: \"kubernetes.io/projected/30cc69b2-dc90-486e-8d87-3ff906ee4288-kube-api-access-jz29j\") pod \"keystone-6d62-account-create-dnmtw\" (UID: \"30cc69b2-dc90-486e-8d87-3ff906ee4288\") " pod="openstack/keystone-6d62-account-create-dnmtw" Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.485496 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz29j\" (UniqueName: \"kubernetes.io/projected/30cc69b2-dc90-486e-8d87-3ff906ee4288-kube-api-access-jz29j\") pod \"keystone-6d62-account-create-dnmtw\" (UID: \"30cc69b2-dc90-486e-8d87-3ff906ee4288\") " pod="openstack/keystone-6d62-account-create-dnmtw" Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.563055 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k59sp\" (UniqueName: \"kubernetes.io/projected/9ac4d0d2-6597-4879-818c-7d0748094e3e-kube-api-access-k59sp\") pod \"placement-83c1-account-create-td9vd\" (UID: 
\"9ac4d0d2-6597-4879-818c-7d0748094e3e\") " pod="openstack/placement-83c1-account-create-td9vd" Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.579998 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k59sp\" (UniqueName: \"kubernetes.io/projected/9ac4d0d2-6597-4879-818c-7d0748094e3e-kube-api-access-k59sp\") pod \"placement-83c1-account-create-td9vd\" (UID: \"9ac4d0d2-6597-4879-818c-7d0748094e3e\") " pod="openstack/placement-83c1-account-create-td9vd" Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.599813 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6d62-account-create-dnmtw" Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.746079 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-83c1-account-create-td9vd" Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.917969 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-c87a-account-create-zvx6w"] Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.919543 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c87a-account-create-zvx6w" Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.927547 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.928433 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c87a-account-create-zvx6w"] Sep 29 10:04:32 crc kubenswrapper[4891]: I0929 10:04:32.972338 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpx8n\" (UniqueName: \"kubernetes.io/projected/84f29b0d-c5e6-48ea-b1b0-01649ee7ec80-kube-api-access-bpx8n\") pod \"glance-c87a-account-create-zvx6w\" (UID: \"84f29b0d-c5e6-48ea-b1b0-01649ee7ec80\") " pod="openstack/glance-c87a-account-create-zvx6w" Sep 29 10:04:33 crc kubenswrapper[4891]: I0929 10:04:33.061219 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6d62-account-create-dnmtw"] Sep 29 10:04:33 crc kubenswrapper[4891]: W0929 10:04:33.063968 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30cc69b2_dc90_486e_8d87_3ff906ee4288.slice/crio-7050f1dff6d79ecfefaa8481dcf4236b1ad9128a1a949d5c03f84e0f3fc16470 WatchSource:0}: Error finding container 7050f1dff6d79ecfefaa8481dcf4236b1ad9128a1a949d5c03f84e0f3fc16470: Status 404 returned error can't find the container with id 7050f1dff6d79ecfefaa8481dcf4236b1ad9128a1a949d5c03f84e0f3fc16470 Sep 29 10:04:33 crc kubenswrapper[4891]: I0929 10:04:33.073559 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpx8n\" (UniqueName: \"kubernetes.io/projected/84f29b0d-c5e6-48ea-b1b0-01649ee7ec80-kube-api-access-bpx8n\") pod \"glance-c87a-account-create-zvx6w\" (UID: \"84f29b0d-c5e6-48ea-b1b0-01649ee7ec80\") " pod="openstack/glance-c87a-account-create-zvx6w" Sep 29 10:04:33 crc kubenswrapper[4891]: 
I0929 10:04:33.096879 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpx8n\" (UniqueName: \"kubernetes.io/projected/84f29b0d-c5e6-48ea-b1b0-01649ee7ec80-kube-api-access-bpx8n\") pod \"glance-c87a-account-create-zvx6w\" (UID: \"84f29b0d-c5e6-48ea-b1b0-01649ee7ec80\") " pod="openstack/glance-c87a-account-create-zvx6w" Sep 29 10:04:33 crc kubenswrapper[4891]: I0929 10:04:33.243827 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c87a-account-create-zvx6w" Sep 29 10:04:33 crc kubenswrapper[4891]: I0929 10:04:33.299683 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-83c1-account-create-td9vd"] Sep 29 10:04:33 crc kubenswrapper[4891]: W0929 10:04:33.305706 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ac4d0d2_6597_4879_818c_7d0748094e3e.slice/crio-9eb1873dbae362e118d6189d15caf038e25e27c9284fe338282b41d1acec64d0 WatchSource:0}: Error finding container 9eb1873dbae362e118d6189d15caf038e25e27c9284fe338282b41d1acec64d0: Status 404 returned error can't find the container with id 9eb1873dbae362e118d6189d15caf038e25e27c9284fe338282b41d1acec64d0 Sep 29 10:04:33 crc kubenswrapper[4891]: I0929 10:04:33.773244 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c87a-account-create-zvx6w"] Sep 29 10:04:33 crc kubenswrapper[4891]: W0929 10:04:33.776601 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84f29b0d_c5e6_48ea_b1b0_01649ee7ec80.slice/crio-fa3f4e5979dad1f9cb7bf091cd114e403cbca9d0cc63de710f63662b28cf0232 WatchSource:0}: Error finding container fa3f4e5979dad1f9cb7bf091cd114e403cbca9d0cc63de710f63662b28cf0232: Status 404 returned error can't find the container with id fa3f4e5979dad1f9cb7bf091cd114e403cbca9d0cc63de710f63662b28cf0232 Sep 29 10:04:33 crc 
kubenswrapper[4891]: I0929 10:04:33.831870 4891 generic.go:334] "Generic (PLEG): container finished" podID="30cc69b2-dc90-486e-8d87-3ff906ee4288" containerID="9ab1bf7e06c3a6c07df39378e627acc0e03a039a5b5d414ca236df7617c23ffe" exitCode=0 Sep 29 10:04:33 crc kubenswrapper[4891]: I0929 10:04:33.831966 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6d62-account-create-dnmtw" event={"ID":"30cc69b2-dc90-486e-8d87-3ff906ee4288","Type":"ContainerDied","Data":"9ab1bf7e06c3a6c07df39378e627acc0e03a039a5b5d414ca236df7617c23ffe"} Sep 29 10:04:33 crc kubenswrapper[4891]: I0929 10:04:33.832003 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6d62-account-create-dnmtw" event={"ID":"30cc69b2-dc90-486e-8d87-3ff906ee4288","Type":"ContainerStarted","Data":"7050f1dff6d79ecfefaa8481dcf4236b1ad9128a1a949d5c03f84e0f3fc16470"} Sep 29 10:04:33 crc kubenswrapper[4891]: I0929 10:04:33.833539 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c87a-account-create-zvx6w" event={"ID":"84f29b0d-c5e6-48ea-b1b0-01649ee7ec80","Type":"ContainerStarted","Data":"fa3f4e5979dad1f9cb7bf091cd114e403cbca9d0cc63de710f63662b28cf0232"} Sep 29 10:04:33 crc kubenswrapper[4891]: I0929 10:04:33.835599 4891 generic.go:334] "Generic (PLEG): container finished" podID="8fd6ea18-7472-42de-b949-140181cd55a5" containerID="a53fb837cafbe9f35e7bc1f41d1176489dd2cd6ae2323976c557b3e7bf630487" exitCode=0 Sep 29 10:04:33 crc kubenswrapper[4891]: I0929 10:04:33.835685 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8fd6ea18-7472-42de-b949-140181cd55a5","Type":"ContainerDied","Data":"a53fb837cafbe9f35e7bc1f41d1176489dd2cd6ae2323976c557b3e7bf630487"} Sep 29 10:04:33 crc kubenswrapper[4891]: I0929 10:04:33.837888 4891 generic.go:334] "Generic (PLEG): container finished" podID="9ac4d0d2-6597-4879-818c-7d0748094e3e" 
containerID="74432fd4470f11739ef6600a7c239f5ea978893cc33261af3c2634446e285ed5" exitCode=0 Sep 29 10:04:33 crc kubenswrapper[4891]: I0929 10:04:33.838017 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-83c1-account-create-td9vd" event={"ID":"9ac4d0d2-6597-4879-818c-7d0748094e3e","Type":"ContainerDied","Data":"74432fd4470f11739ef6600a7c239f5ea978893cc33261af3c2634446e285ed5"} Sep 29 10:04:33 crc kubenswrapper[4891]: I0929 10:04:33.838056 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-83c1-account-create-td9vd" event={"ID":"9ac4d0d2-6597-4879-818c-7d0748094e3e","Type":"ContainerStarted","Data":"9eb1873dbae362e118d6189d15caf038e25e27c9284fe338282b41d1acec64d0"} Sep 29 10:04:33 crc kubenswrapper[4891]: E0929 10:04:33.892200 4891 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fd6ea18_7472_42de_b949_140181cd55a5.slice/crio-a53fb837cafbe9f35e7bc1f41d1176489dd2cd6ae2323976c557b3e7bf630487.scope\": RecentStats: unable to find data in memory cache]" Sep 29 10:04:34 crc kubenswrapper[4891]: I0929 10:04:34.850642 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8fd6ea18-7472-42de-b949-140181cd55a5","Type":"ContainerStarted","Data":"118aac32b72509b12aeebfcc8ea409e79f304c4dcb0b284ed327e6f8bff96902"} Sep 29 10:04:34 crc kubenswrapper[4891]: I0929 10:04:34.851357 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:04:34 crc kubenswrapper[4891]: I0929 10:04:34.853063 4891 generic.go:334] "Generic (PLEG): container finished" podID="84f29b0d-c5e6-48ea-b1b0-01649ee7ec80" containerID="497efa1909363164e1611b21306d4443d79bd885cb159a68bb97c942238be1dd" exitCode=0 Sep 29 10:04:34 crc kubenswrapper[4891]: I0929 10:04:34.853122 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-c87a-account-create-zvx6w" event={"ID":"84f29b0d-c5e6-48ea-b1b0-01649ee7ec80","Type":"ContainerDied","Data":"497efa1909363164e1611b21306d4443d79bd885cb159a68bb97c942238be1dd"} Sep 29 10:04:34 crc kubenswrapper[4891]: I0929 10:04:34.894069 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=34.224520391 podStartE2EDuration="1m6.89403533s" podCreationTimestamp="2025-09-29 10:03:28 +0000 UTC" firstStartedPulling="2025-09-29 10:03:30.328322723 +0000 UTC m=+940.533491044" lastFinishedPulling="2025-09-29 10:04:02.997837662 +0000 UTC m=+973.203005983" observedRunningTime="2025-09-29 10:04:34.882298345 +0000 UTC m=+1005.087466756" watchObservedRunningTime="2025-09-29 10:04:34.89403533 +0000 UTC m=+1005.099203691" Sep 29 10:04:35 crc kubenswrapper[4891]: I0929 10:04:35.297421 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-83c1-account-create-td9vd" Sep 29 10:04:35 crc kubenswrapper[4891]: I0929 10:04:35.303926 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6d62-account-create-dnmtw" Sep 29 10:04:35 crc kubenswrapper[4891]: I0929 10:04:35.418438 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz29j\" (UniqueName: \"kubernetes.io/projected/30cc69b2-dc90-486e-8d87-3ff906ee4288-kube-api-access-jz29j\") pod \"30cc69b2-dc90-486e-8d87-3ff906ee4288\" (UID: \"30cc69b2-dc90-486e-8d87-3ff906ee4288\") " Sep 29 10:04:35 crc kubenswrapper[4891]: I0929 10:04:35.418757 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k59sp\" (UniqueName: \"kubernetes.io/projected/9ac4d0d2-6597-4879-818c-7d0748094e3e-kube-api-access-k59sp\") pod \"9ac4d0d2-6597-4879-818c-7d0748094e3e\" (UID: \"9ac4d0d2-6597-4879-818c-7d0748094e3e\") " Sep 29 10:04:35 crc kubenswrapper[4891]: I0929 10:04:35.426577 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30cc69b2-dc90-486e-8d87-3ff906ee4288-kube-api-access-jz29j" (OuterVolumeSpecName: "kube-api-access-jz29j") pod "30cc69b2-dc90-486e-8d87-3ff906ee4288" (UID: "30cc69b2-dc90-486e-8d87-3ff906ee4288"). InnerVolumeSpecName "kube-api-access-jz29j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:04:35 crc kubenswrapper[4891]: I0929 10:04:35.426663 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ac4d0d2-6597-4879-818c-7d0748094e3e-kube-api-access-k59sp" (OuterVolumeSpecName: "kube-api-access-k59sp") pod "9ac4d0d2-6597-4879-818c-7d0748094e3e" (UID: "9ac4d0d2-6597-4879-818c-7d0748094e3e"). InnerVolumeSpecName "kube-api-access-k59sp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:04:35 crc kubenswrapper[4891]: I0929 10:04:35.521416 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k59sp\" (UniqueName: \"kubernetes.io/projected/9ac4d0d2-6597-4879-818c-7d0748094e3e-kube-api-access-k59sp\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:35 crc kubenswrapper[4891]: I0929 10:04:35.521458 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz29j\" (UniqueName: \"kubernetes.io/projected/30cc69b2-dc90-486e-8d87-3ff906ee4288-kube-api-access-jz29j\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:35 crc kubenswrapper[4891]: I0929 10:04:35.865516 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-83c1-account-create-td9vd" event={"ID":"9ac4d0d2-6597-4879-818c-7d0748094e3e","Type":"ContainerDied","Data":"9eb1873dbae362e118d6189d15caf038e25e27c9284fe338282b41d1acec64d0"} Sep 29 10:04:35 crc kubenswrapper[4891]: I0929 10:04:35.865570 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9eb1873dbae362e118d6189d15caf038e25e27c9284fe338282b41d1acec64d0" Sep 29 10:04:35 crc kubenswrapper[4891]: I0929 10:04:35.866838 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-83c1-account-create-td9vd" Sep 29 10:04:35 crc kubenswrapper[4891]: I0929 10:04:35.867428 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6d62-account-create-dnmtw" event={"ID":"30cc69b2-dc90-486e-8d87-3ff906ee4288","Type":"ContainerDied","Data":"7050f1dff6d79ecfefaa8481dcf4236b1ad9128a1a949d5c03f84e0f3fc16470"} Sep 29 10:04:35 crc kubenswrapper[4891]: I0929 10:04:35.867495 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6d62-account-create-dnmtw" Sep 29 10:04:35 crc kubenswrapper[4891]: I0929 10:04:35.867498 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7050f1dff6d79ecfefaa8481dcf4236b1ad9128a1a949d5c03f84e0f3fc16470" Sep 29 10:04:36 crc kubenswrapper[4891]: I0929 10:04:36.185904 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:04:36 crc kubenswrapper[4891]: I0929 10:04:36.186291 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:04:36 crc kubenswrapper[4891]: I0929 10:04:36.186352 4891 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 10:04:36 crc kubenswrapper[4891]: I0929 10:04:36.187137 4891 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"65905951ff597e2e9f5a100530eb12ae214ca6c149882ca87ecb04c813df66e6"} pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 10:04:36 crc kubenswrapper[4891]: I0929 10:04:36.187206 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" 
containerID="cri-o://65905951ff597e2e9f5a100530eb12ae214ca6c149882ca87ecb04c813df66e6" gracePeriod=600 Sep 29 10:04:36 crc kubenswrapper[4891]: I0929 10:04:36.197617 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c87a-account-create-zvx6w" Sep 29 10:04:36 crc kubenswrapper[4891]: I0929 10:04:36.344979 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpx8n\" (UniqueName: \"kubernetes.io/projected/84f29b0d-c5e6-48ea-b1b0-01649ee7ec80-kube-api-access-bpx8n\") pod \"84f29b0d-c5e6-48ea-b1b0-01649ee7ec80\" (UID: \"84f29b0d-c5e6-48ea-b1b0-01649ee7ec80\") " Sep 29 10:04:36 crc kubenswrapper[4891]: I0929 10:04:36.352100 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84f29b0d-c5e6-48ea-b1b0-01649ee7ec80-kube-api-access-bpx8n" (OuterVolumeSpecName: "kube-api-access-bpx8n") pod "84f29b0d-c5e6-48ea-b1b0-01649ee7ec80" (UID: "84f29b0d-c5e6-48ea-b1b0-01649ee7ec80"). InnerVolumeSpecName "kube-api-access-bpx8n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:04:36 crc kubenswrapper[4891]: I0929 10:04:36.447633 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpx8n\" (UniqueName: \"kubernetes.io/projected/84f29b0d-c5e6-48ea-b1b0-01649ee7ec80-kube-api-access-bpx8n\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:36 crc kubenswrapper[4891]: I0929 10:04:36.892054 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c87a-account-create-zvx6w" event={"ID":"84f29b0d-c5e6-48ea-b1b0-01649ee7ec80","Type":"ContainerDied","Data":"fa3f4e5979dad1f9cb7bf091cd114e403cbca9d0cc63de710f63662b28cf0232"} Sep 29 10:04:36 crc kubenswrapper[4891]: I0929 10:04:36.892133 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa3f4e5979dad1f9cb7bf091cd114e403cbca9d0cc63de710f63662b28cf0232" Sep 29 10:04:36 crc kubenswrapper[4891]: I0929 10:04:36.892185 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c87a-account-create-zvx6w" Sep 29 10:04:36 crc kubenswrapper[4891]: I0929 10:04:36.898766 4891 generic.go:334] "Generic (PLEG): container finished" podID="582de198-5a15-4c4c-aaea-881c638a42ac" containerID="65905951ff597e2e9f5a100530eb12ae214ca6c149882ca87ecb04c813df66e6" exitCode=0 Sep 29 10:04:36 crc kubenswrapper[4891]: I0929 10:04:36.898916 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerDied","Data":"65905951ff597e2e9f5a100530eb12ae214ca6c149882ca87ecb04c813df66e6"} Sep 29 10:04:36 crc kubenswrapper[4891]: I0929 10:04:36.899021 4891 scope.go:117] "RemoveContainer" containerID="c84275b86dc57120809c449b950b94bd4d98c74f39bd83b524ef83f4962bef20" Sep 29 10:04:37 crc kubenswrapper[4891]: I0929 10:04:37.265854 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-jq4xk" 
podUID="7484acb7-f4b2-417b-a478-86b8c5999c34" containerName="ovn-controller" probeResult="failure" output=< Sep 29 10:04:37 crc kubenswrapper[4891]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Sep 29 10:04:37 crc kubenswrapper[4891]: > Sep 29 10:04:37 crc kubenswrapper[4891]: I0929 10:04:37.466502 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-etc-swift\") pod \"swift-storage-0\" (UID: \"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e\") " pod="openstack/swift-storage-0" Sep 29 10:04:37 crc kubenswrapper[4891]: I0929 10:04:37.484346 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/66ec83ce-b4c6-412b-b7c4-6a61c6914c0e-etc-swift\") pod \"swift-storage-0\" (UID: \"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e\") " pod="openstack/swift-storage-0" Sep 29 10:04:37 crc kubenswrapper[4891]: I0929 10:04:37.571526 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-vxc7p" Sep 29 10:04:37 crc kubenswrapper[4891]: I0929 10:04:37.584430 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Sep 29 10:04:37 crc kubenswrapper[4891]: I0929 10:04:37.588576 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-vxc7p" Sep 29 10:04:37 crc kubenswrapper[4891]: I0929 10:04:37.856193 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-jq4xk-config-w79sg"] Sep 29 10:04:37 crc kubenswrapper[4891]: E0929 10:04:37.856999 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84f29b0d-c5e6-48ea-b1b0-01649ee7ec80" containerName="mariadb-account-create" Sep 29 10:04:37 crc kubenswrapper[4891]: I0929 10:04:37.857022 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="84f29b0d-c5e6-48ea-b1b0-01649ee7ec80" containerName="mariadb-account-create" Sep 29 10:04:37 crc kubenswrapper[4891]: E0929 10:04:37.857039 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30cc69b2-dc90-486e-8d87-3ff906ee4288" containerName="mariadb-account-create" Sep 29 10:04:37 crc kubenswrapper[4891]: I0929 10:04:37.857047 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="30cc69b2-dc90-486e-8d87-3ff906ee4288" containerName="mariadb-account-create" Sep 29 10:04:37 crc kubenswrapper[4891]: E0929 10:04:37.857079 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac4d0d2-6597-4879-818c-7d0748094e3e" containerName="mariadb-account-create" Sep 29 10:04:37 crc kubenswrapper[4891]: I0929 10:04:37.857086 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac4d0d2-6597-4879-818c-7d0748094e3e" containerName="mariadb-account-create" Sep 29 10:04:37 crc kubenswrapper[4891]: I0929 10:04:37.857271 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="84f29b0d-c5e6-48ea-b1b0-01649ee7ec80" containerName="mariadb-account-create" Sep 29 10:04:37 crc kubenswrapper[4891]: I0929 10:04:37.857292 4891 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="30cc69b2-dc90-486e-8d87-3ff906ee4288" containerName="mariadb-account-create" Sep 29 10:04:37 crc kubenswrapper[4891]: I0929 10:04:37.857313 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ac4d0d2-6597-4879-818c-7d0748094e3e" containerName="mariadb-account-create" Sep 29 10:04:37 crc kubenswrapper[4891]: I0929 10:04:37.858008 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jq4xk-config-w79sg" Sep 29 10:04:37 crc kubenswrapper[4891]: I0929 10:04:37.866418 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 29 10:04:37 crc kubenswrapper[4891]: I0929 10:04:37.873389 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jq4xk-config-w79sg"] Sep 29 10:04:37 crc kubenswrapper[4891]: I0929 10:04:37.909490 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerStarted","Data":"7bbfbfc1e771a86a3e36e83972d117fbe73f0bd186aec5042fa0d90572ce1a07"} Sep 29 10:04:37 crc kubenswrapper[4891]: I0929 10:04:37.975164 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lff42\" (UniqueName: \"kubernetes.io/projected/7188ad19-b124-4699-b972-18f034bc40e7-kube-api-access-lff42\") pod \"ovn-controller-jq4xk-config-w79sg\" (UID: \"7188ad19-b124-4699-b972-18f034bc40e7\") " pod="openstack/ovn-controller-jq4xk-config-w79sg" Sep 29 10:04:37 crc kubenswrapper[4891]: I0929 10:04:37.975238 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7188ad19-b124-4699-b972-18f034bc40e7-var-run-ovn\") pod \"ovn-controller-jq4xk-config-w79sg\" (UID: \"7188ad19-b124-4699-b972-18f034bc40e7\") " 
pod="openstack/ovn-controller-jq4xk-config-w79sg" Sep 29 10:04:37 crc kubenswrapper[4891]: I0929 10:04:37.975265 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7188ad19-b124-4699-b972-18f034bc40e7-scripts\") pod \"ovn-controller-jq4xk-config-w79sg\" (UID: \"7188ad19-b124-4699-b972-18f034bc40e7\") " pod="openstack/ovn-controller-jq4xk-config-w79sg" Sep 29 10:04:37 crc kubenswrapper[4891]: I0929 10:04:37.975307 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7188ad19-b124-4699-b972-18f034bc40e7-additional-scripts\") pod \"ovn-controller-jq4xk-config-w79sg\" (UID: \"7188ad19-b124-4699-b972-18f034bc40e7\") " pod="openstack/ovn-controller-jq4xk-config-w79sg" Sep 29 10:04:37 crc kubenswrapper[4891]: I0929 10:04:37.975336 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7188ad19-b124-4699-b972-18f034bc40e7-var-log-ovn\") pod \"ovn-controller-jq4xk-config-w79sg\" (UID: \"7188ad19-b124-4699-b972-18f034bc40e7\") " pod="openstack/ovn-controller-jq4xk-config-w79sg" Sep 29 10:04:37 crc kubenswrapper[4891]: I0929 10:04:37.975390 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7188ad19-b124-4699-b972-18f034bc40e7-var-run\") pod \"ovn-controller-jq4xk-config-w79sg\" (UID: \"7188ad19-b124-4699-b972-18f034bc40e7\") " pod="openstack/ovn-controller-jq4xk-config-w79sg" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.077241 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lff42\" (UniqueName: \"kubernetes.io/projected/7188ad19-b124-4699-b972-18f034bc40e7-kube-api-access-lff42\") pod 
\"ovn-controller-jq4xk-config-w79sg\" (UID: \"7188ad19-b124-4699-b972-18f034bc40e7\") " pod="openstack/ovn-controller-jq4xk-config-w79sg" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.077405 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7188ad19-b124-4699-b972-18f034bc40e7-var-run-ovn\") pod \"ovn-controller-jq4xk-config-w79sg\" (UID: \"7188ad19-b124-4699-b972-18f034bc40e7\") " pod="openstack/ovn-controller-jq4xk-config-w79sg" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.077454 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7188ad19-b124-4699-b972-18f034bc40e7-scripts\") pod \"ovn-controller-jq4xk-config-w79sg\" (UID: \"7188ad19-b124-4699-b972-18f034bc40e7\") " pod="openstack/ovn-controller-jq4xk-config-w79sg" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.077556 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7188ad19-b124-4699-b972-18f034bc40e7-additional-scripts\") pod \"ovn-controller-jq4xk-config-w79sg\" (UID: \"7188ad19-b124-4699-b972-18f034bc40e7\") " pod="openstack/ovn-controller-jq4xk-config-w79sg" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.077618 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7188ad19-b124-4699-b972-18f034bc40e7-var-log-ovn\") pod \"ovn-controller-jq4xk-config-w79sg\" (UID: \"7188ad19-b124-4699-b972-18f034bc40e7\") " pod="openstack/ovn-controller-jq4xk-config-w79sg" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.077859 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7188ad19-b124-4699-b972-18f034bc40e7-var-run\") pod 
\"ovn-controller-jq4xk-config-w79sg\" (UID: \"7188ad19-b124-4699-b972-18f034bc40e7\") " pod="openstack/ovn-controller-jq4xk-config-w79sg" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.078025 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7188ad19-b124-4699-b972-18f034bc40e7-var-log-ovn\") pod \"ovn-controller-jq4xk-config-w79sg\" (UID: \"7188ad19-b124-4699-b972-18f034bc40e7\") " pod="openstack/ovn-controller-jq4xk-config-w79sg" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.078030 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7188ad19-b124-4699-b972-18f034bc40e7-var-run\") pod \"ovn-controller-jq4xk-config-w79sg\" (UID: \"7188ad19-b124-4699-b972-18f034bc40e7\") " pod="openstack/ovn-controller-jq4xk-config-w79sg" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.078834 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7188ad19-b124-4699-b972-18f034bc40e7-additional-scripts\") pod \"ovn-controller-jq4xk-config-w79sg\" (UID: \"7188ad19-b124-4699-b972-18f034bc40e7\") " pod="openstack/ovn-controller-jq4xk-config-w79sg" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.078903 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7188ad19-b124-4699-b972-18f034bc40e7-var-run-ovn\") pod \"ovn-controller-jq4xk-config-w79sg\" (UID: \"7188ad19-b124-4699-b972-18f034bc40e7\") " pod="openstack/ovn-controller-jq4xk-config-w79sg" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.080438 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7188ad19-b124-4699-b972-18f034bc40e7-scripts\") pod \"ovn-controller-jq4xk-config-w79sg\" (UID: 
\"7188ad19-b124-4699-b972-18f034bc40e7\") " pod="openstack/ovn-controller-jq4xk-config-w79sg" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.106243 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lff42\" (UniqueName: \"kubernetes.io/projected/7188ad19-b124-4699-b972-18f034bc40e7-kube-api-access-lff42\") pod \"ovn-controller-jq4xk-config-w79sg\" (UID: \"7188ad19-b124-4699-b972-18f034bc40e7\") " pod="openstack/ovn-controller-jq4xk-config-w79sg" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.162520 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-cp9f2"] Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.163905 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-cp9f2" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.166084 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.166174 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-cck6l" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.172028 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-cp9f2"] Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.198588 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jq4xk-config-w79sg" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.248685 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.280947 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11c99c95-cf39-4352-b4c0-25c0ac4b6465-combined-ca-bundle\") pod \"glance-db-sync-cp9f2\" (UID: \"11c99c95-cf39-4352-b4c0-25c0ac4b6465\") " pod="openstack/glance-db-sync-cp9f2" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.281065 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11c99c95-cf39-4352-b4c0-25c0ac4b6465-config-data\") pod \"glance-db-sync-cp9f2\" (UID: \"11c99c95-cf39-4352-b4c0-25c0ac4b6465\") " pod="openstack/glance-db-sync-cp9f2" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.281107 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/11c99c95-cf39-4352-b4c0-25c0ac4b6465-db-sync-config-data\") pod \"glance-db-sync-cp9f2\" (UID: \"11c99c95-cf39-4352-b4c0-25c0ac4b6465\") " pod="openstack/glance-db-sync-cp9f2" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.281178 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6jbw\" (UniqueName: \"kubernetes.io/projected/11c99c95-cf39-4352-b4c0-25c0ac4b6465-kube-api-access-z6jbw\") pod \"glance-db-sync-cp9f2\" (UID: \"11c99c95-cf39-4352-b4c0-25c0ac4b6465\") " pod="openstack/glance-db-sync-cp9f2" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.382294 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/11c99c95-cf39-4352-b4c0-25c0ac4b6465-combined-ca-bundle\") pod \"glance-db-sync-cp9f2\" (UID: \"11c99c95-cf39-4352-b4c0-25c0ac4b6465\") " pod="openstack/glance-db-sync-cp9f2" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.382385 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11c99c95-cf39-4352-b4c0-25c0ac4b6465-config-data\") pod \"glance-db-sync-cp9f2\" (UID: \"11c99c95-cf39-4352-b4c0-25c0ac4b6465\") " pod="openstack/glance-db-sync-cp9f2" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.382406 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/11c99c95-cf39-4352-b4c0-25c0ac4b6465-db-sync-config-data\") pod \"glance-db-sync-cp9f2\" (UID: \"11c99c95-cf39-4352-b4c0-25c0ac4b6465\") " pod="openstack/glance-db-sync-cp9f2" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.382451 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6jbw\" (UniqueName: \"kubernetes.io/projected/11c99c95-cf39-4352-b4c0-25c0ac4b6465-kube-api-access-z6jbw\") pod \"glance-db-sync-cp9f2\" (UID: \"11c99c95-cf39-4352-b4c0-25c0ac4b6465\") " pod="openstack/glance-db-sync-cp9f2" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.392242 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11c99c95-cf39-4352-b4c0-25c0ac4b6465-config-data\") pod \"glance-db-sync-cp9f2\" (UID: \"11c99c95-cf39-4352-b4c0-25c0ac4b6465\") " pod="openstack/glance-db-sync-cp9f2" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.392540 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/11c99c95-cf39-4352-b4c0-25c0ac4b6465-db-sync-config-data\") pod \"glance-db-sync-cp9f2\" (UID: 
\"11c99c95-cf39-4352-b4c0-25c0ac4b6465\") " pod="openstack/glance-db-sync-cp9f2" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.392640 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11c99c95-cf39-4352-b4c0-25c0ac4b6465-combined-ca-bundle\") pod \"glance-db-sync-cp9f2\" (UID: \"11c99c95-cf39-4352-b4c0-25c0ac4b6465\") " pod="openstack/glance-db-sync-cp9f2" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.398871 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6jbw\" (UniqueName: \"kubernetes.io/projected/11c99c95-cf39-4352-b4c0-25c0ac4b6465-kube-api-access-z6jbw\") pod \"glance-db-sync-cp9f2\" (UID: \"11c99c95-cf39-4352-b4c0-25c0ac4b6465\") " pod="openstack/glance-db-sync-cp9f2" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.482641 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-cp9f2" Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.758966 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jq4xk-config-w79sg"] Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.924526 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jq4xk-config-w79sg" event={"ID":"7188ad19-b124-4699-b972-18f034bc40e7","Type":"ContainerStarted","Data":"9c6f11a699e0556ff76059b00ef194964fafae0dac89a9bd10d52a418cae7d7c"} Sep 29 10:04:38 crc kubenswrapper[4891]: I0929 10:04:38.930350 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e","Type":"ContainerStarted","Data":"69ec343e1199e98bca5bbe1b2914ffc50044732b735b1585212892bbf1ba99d7"} Sep 29 10:04:39 crc kubenswrapper[4891]: I0929 10:04:39.057724 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-cp9f2"] Sep 29 10:04:39 crc kubenswrapper[4891]: W0929 
10:04:39.061850 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11c99c95_cf39_4352_b4c0_25c0ac4b6465.slice/crio-3aaf544afa4f5c9a846718e376b05065a51fc087c11fe32b37866c318501848d WatchSource:0}: Error finding container 3aaf544afa4f5c9a846718e376b05065a51fc087c11fe32b37866c318501848d: Status 404 returned error can't find the container with id 3aaf544afa4f5c9a846718e376b05065a51fc087c11fe32b37866c318501848d Sep 29 10:04:39 crc kubenswrapper[4891]: I0929 10:04:39.955107 4891 generic.go:334] "Generic (PLEG): container finished" podID="7188ad19-b124-4699-b972-18f034bc40e7" containerID="4aa39b8e2d1af6216ca011bd793c036285d180bacb8213ef333c981ae198cc94" exitCode=0 Sep 29 10:04:39 crc kubenswrapper[4891]: I0929 10:04:39.955219 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jq4xk-config-w79sg" event={"ID":"7188ad19-b124-4699-b972-18f034bc40e7","Type":"ContainerDied","Data":"4aa39b8e2d1af6216ca011bd793c036285d180bacb8213ef333c981ae198cc94"} Sep 29 10:04:39 crc kubenswrapper[4891]: I0929 10:04:39.958934 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-cp9f2" event={"ID":"11c99c95-cf39-4352-b4c0-25c0ac4b6465","Type":"ContainerStarted","Data":"3aaf544afa4f5c9a846718e376b05065a51fc087c11fe32b37866c318501848d"} Sep 29 10:04:39 crc kubenswrapper[4891]: I0929 10:04:39.977164 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e","Type":"ContainerStarted","Data":"c4f8799d998ab30fabd6744418f0702a63a5e54a72d4ac8437db2c9bd41d14c4"} Sep 29 10:04:39 crc kubenswrapper[4891]: I0929 10:04:39.977218 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e","Type":"ContainerStarted","Data":"17464bc26527fc6958c329fc60b36bfb3ebc9b18b638192203c7f370466fab1d"} Sep 29 10:04:40 crc 
kubenswrapper[4891]: I0929 10:04:40.989735 4891 generic.go:334] "Generic (PLEG): container finished" podID="f895d522-5026-4d72-862e-1a2b1bd5ee3c" containerID="f35fd94d86fef28eca621ae9c5c625c1d28e5bd5c8a371383e8e27de6992ac12" exitCode=0 Sep 29 10:04:40 crc kubenswrapper[4891]: I0929 10:04:40.990209 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f895d522-5026-4d72-862e-1a2b1bd5ee3c","Type":"ContainerDied","Data":"f35fd94d86fef28eca621ae9c5c625c1d28e5bd5c8a371383e8e27de6992ac12"} Sep 29 10:04:40 crc kubenswrapper[4891]: I0929 10:04:40.999858 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e","Type":"ContainerStarted","Data":"724aa2a6775561d2b8c77cfcca881e1a1dd8b5bc1a23682c438784f91f42443f"} Sep 29 10:04:40 crc kubenswrapper[4891]: I0929 10:04:40.999900 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e","Type":"ContainerStarted","Data":"64e559692175e17bd4e5532c6d691507cf351ad517c74c969ed92863f0c797c9"} Sep 29 10:04:41 crc kubenswrapper[4891]: I0929 10:04:41.323976 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jq4xk-config-w79sg" Sep 29 10:04:41 crc kubenswrapper[4891]: I0929 10:04:41.442924 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7188ad19-b124-4699-b972-18f034bc40e7-additional-scripts\") pod \"7188ad19-b124-4699-b972-18f034bc40e7\" (UID: \"7188ad19-b124-4699-b972-18f034bc40e7\") " Sep 29 10:04:41 crc kubenswrapper[4891]: I0929 10:04:41.443021 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lff42\" (UniqueName: \"kubernetes.io/projected/7188ad19-b124-4699-b972-18f034bc40e7-kube-api-access-lff42\") pod \"7188ad19-b124-4699-b972-18f034bc40e7\" (UID: \"7188ad19-b124-4699-b972-18f034bc40e7\") " Sep 29 10:04:41 crc kubenswrapper[4891]: I0929 10:04:41.443070 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7188ad19-b124-4699-b972-18f034bc40e7-var-log-ovn\") pod \"7188ad19-b124-4699-b972-18f034bc40e7\" (UID: \"7188ad19-b124-4699-b972-18f034bc40e7\") " Sep 29 10:04:41 crc kubenswrapper[4891]: I0929 10:04:41.443122 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7188ad19-b124-4699-b972-18f034bc40e7-scripts\") pod \"7188ad19-b124-4699-b972-18f034bc40e7\" (UID: \"7188ad19-b124-4699-b972-18f034bc40e7\") " Sep 29 10:04:41 crc kubenswrapper[4891]: I0929 10:04:41.443170 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7188ad19-b124-4699-b972-18f034bc40e7-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "7188ad19-b124-4699-b972-18f034bc40e7" (UID: "7188ad19-b124-4699-b972-18f034bc40e7"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:04:41 crc kubenswrapper[4891]: I0929 10:04:41.443183 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7188ad19-b124-4699-b972-18f034bc40e7-var-run-ovn\") pod \"7188ad19-b124-4699-b972-18f034bc40e7\" (UID: \"7188ad19-b124-4699-b972-18f034bc40e7\") " Sep 29 10:04:41 crc kubenswrapper[4891]: I0929 10:04:41.443260 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7188ad19-b124-4699-b972-18f034bc40e7-var-run\") pod \"7188ad19-b124-4699-b972-18f034bc40e7\" (UID: \"7188ad19-b124-4699-b972-18f034bc40e7\") " Sep 29 10:04:41 crc kubenswrapper[4891]: I0929 10:04:41.443437 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7188ad19-b124-4699-b972-18f034bc40e7-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "7188ad19-b124-4699-b972-18f034bc40e7" (UID: "7188ad19-b124-4699-b972-18f034bc40e7"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:04:41 crc kubenswrapper[4891]: I0929 10:04:41.443500 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7188ad19-b124-4699-b972-18f034bc40e7-var-run" (OuterVolumeSpecName: "var-run") pod "7188ad19-b124-4699-b972-18f034bc40e7" (UID: "7188ad19-b124-4699-b972-18f034bc40e7"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:04:41 crc kubenswrapper[4891]: I0929 10:04:41.443734 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7188ad19-b124-4699-b972-18f034bc40e7-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "7188ad19-b124-4699-b972-18f034bc40e7" (UID: "7188ad19-b124-4699-b972-18f034bc40e7"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:04:41 crc kubenswrapper[4891]: I0929 10:04:41.443958 4891 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7188ad19-b124-4699-b972-18f034bc40e7-var-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:41 crc kubenswrapper[4891]: I0929 10:04:41.443982 4891 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7188ad19-b124-4699-b972-18f034bc40e7-var-run\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:41 crc kubenswrapper[4891]: I0929 10:04:41.443987 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7188ad19-b124-4699-b972-18f034bc40e7-scripts" (OuterVolumeSpecName: "scripts") pod "7188ad19-b124-4699-b972-18f034bc40e7" (UID: "7188ad19-b124-4699-b972-18f034bc40e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:04:41 crc kubenswrapper[4891]: I0929 10:04:41.443999 4891 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7188ad19-b124-4699-b972-18f034bc40e7-additional-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:41 crc kubenswrapper[4891]: I0929 10:04:41.444017 4891 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7188ad19-b124-4699-b972-18f034bc40e7-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:41 crc kubenswrapper[4891]: I0929 10:04:41.451953 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7188ad19-b124-4699-b972-18f034bc40e7-kube-api-access-lff42" (OuterVolumeSpecName: "kube-api-access-lff42") pod "7188ad19-b124-4699-b972-18f034bc40e7" (UID: "7188ad19-b124-4699-b972-18f034bc40e7"). InnerVolumeSpecName "kube-api-access-lff42". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:04:41 crc kubenswrapper[4891]: I0929 10:04:41.545558 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lff42\" (UniqueName: \"kubernetes.io/projected/7188ad19-b124-4699-b972-18f034bc40e7-kube-api-access-lff42\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:41 crc kubenswrapper[4891]: I0929 10:04:41.545601 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7188ad19-b124-4699-b972-18f034bc40e7-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:42 crc kubenswrapper[4891]: I0929 10:04:42.008642 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jq4xk-config-w79sg" Sep 29 10:04:42 crc kubenswrapper[4891]: I0929 10:04:42.008655 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jq4xk-config-w79sg" event={"ID":"7188ad19-b124-4699-b972-18f034bc40e7","Type":"ContainerDied","Data":"9c6f11a699e0556ff76059b00ef194964fafae0dac89a9bd10d52a418cae7d7c"} Sep 29 10:04:42 crc kubenswrapper[4891]: I0929 10:04:42.009224 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c6f11a699e0556ff76059b00ef194964fafae0dac89a9bd10d52a418cae7d7c" Sep 29 10:04:42 crc kubenswrapper[4891]: I0929 10:04:42.011328 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f895d522-5026-4d72-862e-1a2b1bd5ee3c","Type":"ContainerStarted","Data":"65a17fedb0791dbd519736c2d594ac4926635a0b555f8613661cd2db2bb0554b"} Sep 29 10:04:42 crc kubenswrapper[4891]: I0929 10:04:42.012178 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 29 10:04:42 crc kubenswrapper[4891]: I0929 10:04:42.037911 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" 
podStartSLOduration=-9223371961.816887 podStartE2EDuration="1m15.037888454s" podCreationTimestamp="2025-09-29 10:03:27 +0000 UTC" firstStartedPulling="2025-09-29 10:03:29.923610418 +0000 UTC m=+940.128778739" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:04:42.037616206 +0000 UTC m=+1012.242784527" watchObservedRunningTime="2025-09-29 10:04:42.037888454 +0000 UTC m=+1012.243056805" Sep 29 10:04:42 crc kubenswrapper[4891]: I0929 10:04:42.281498 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-jq4xk" Sep 29 10:04:42 crc kubenswrapper[4891]: I0929 10:04:42.460663 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-jq4xk-config-w79sg"] Sep 29 10:04:42 crc kubenswrapper[4891]: I0929 10:04:42.476129 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-jq4xk-config-w79sg"] Sep 29 10:04:43 crc kubenswrapper[4891]: I0929 10:04:43.031736 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e","Type":"ContainerStarted","Data":"fd8975cc2926aa2a3c5a083d25f6c8342fab9a75750a128508efa236011f251c"} Sep 29 10:04:44 crc kubenswrapper[4891]: I0929 10:04:44.043445 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e","Type":"ContainerStarted","Data":"0732655d07b617960cd608145804717d327b526f9351e6165c5431843e6d6095"} Sep 29 10:04:44 crc kubenswrapper[4891]: I0929 10:04:44.043888 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e","Type":"ContainerStarted","Data":"f8af0a3dc8e615cbd2e085128c61a453091e24a71b235ff18411e037d9b96619"} Sep 29 10:04:44 crc kubenswrapper[4891]: I0929 10:04:44.043905 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e","Type":"ContainerStarted","Data":"6439570f877341a4f5bb341f361b685b1ba10ec95b6b2ca3273817cf50696fe3"} Sep 29 10:04:44 crc kubenswrapper[4891]: I0929 10:04:44.414714 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7188ad19-b124-4699-b972-18f034bc40e7" path="/var/lib/kubelet/pods/7188ad19-b124-4699-b972-18f034bc40e7/volumes" Sep 29 10:04:49 crc kubenswrapper[4891]: I0929 10:04:49.602069 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:04:51 crc kubenswrapper[4891]: I0929 10:04:51.114316 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e","Type":"ContainerStarted","Data":"55e80b66b2c62f920c3da219d077f646003050619b6058a3c6bd0be11ad2a8c4"} Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.128229 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-cp9f2" event={"ID":"11c99c95-cf39-4352-b4c0-25c0ac4b6465","Type":"ContainerStarted","Data":"0637e2fb45d8c9d1f0ccc2cb7817d4bd647859ebe52a35b0cbc9eee3a3593351"} Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.136712 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e","Type":"ContainerStarted","Data":"50482bc65771d9f901973015be3eba06d8267c32e7043369f4d18388a4d11fb4"} Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.136750 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e","Type":"ContainerStarted","Data":"a841fbe855bf7ad94ea3ba8a4e890f266d9af40251b98de1ca29997209fe41d8"} Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.136759 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e","Type":"ContainerStarted","Data":"c4884144e2f96b0a6b2b7029365e69b266a2ee09baaacffaf83ee161cc078a79"} Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.136771 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e","Type":"ContainerStarted","Data":"d02f8bd14166fd16dbdcaa5dd2957c0de3e7cbfe8e2aa5e717fd945ed65f3527"} Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.136779 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e","Type":"ContainerStarted","Data":"d6f5a2d21e9f676078e4297c5bbe9e352599aff5ac703e53361b2cbd02ee978a"} Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.136806 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"66ec83ce-b4c6-412b-b7c4-6a61c6914c0e","Type":"ContainerStarted","Data":"400a77f57a67a72423776c98e20acb853429b598759fc1df811cc4e667a06369"} Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.147939 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-cp9f2" podStartSLOduration=2.4915733429999998 podStartE2EDuration="14.147920766s" podCreationTimestamp="2025-09-29 10:04:38 +0000 UTC" firstStartedPulling="2025-09-29 10:04:39.064542006 +0000 UTC m=+1009.269710327" lastFinishedPulling="2025-09-29 10:04:50.720889429 +0000 UTC m=+1020.926057750" observedRunningTime="2025-09-29 10:04:52.14499395 +0000 UTC m=+1022.350162271" watchObservedRunningTime="2025-09-29 10:04:52.147920766 +0000 UTC m=+1022.353089087" Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.179812 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=35.786319561 podStartE2EDuration="48.179780532s" podCreationTimestamp="2025-09-29 10:04:04 +0000 UTC" 
firstStartedPulling="2025-09-29 10:04:38.248172111 +0000 UTC m=+1008.453340432" lastFinishedPulling="2025-09-29 10:04:50.641633042 +0000 UTC m=+1020.846801403" observedRunningTime="2025-09-29 10:04:52.174816096 +0000 UTC m=+1022.379984427" watchObservedRunningTime="2025-09-29 10:04:52.179780532 +0000 UTC m=+1022.384948853" Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.477709 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-dzhx8"] Sep 29 10:04:52 crc kubenswrapper[4891]: E0929 10:04:52.478087 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7188ad19-b124-4699-b972-18f034bc40e7" containerName="ovn-config" Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.478100 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="7188ad19-b124-4699-b972-18f034bc40e7" containerName="ovn-config" Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.478308 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="7188ad19-b124-4699-b972-18f034bc40e7" containerName="ovn-config" Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.479229 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.481094 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.498596 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-dzhx8"] Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.585640 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-dzhx8\" (UID: \"99b09959-1535-4dd2-b02c-55cf144f52c2\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.585749 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh8q6\" (UniqueName: \"kubernetes.io/projected/99b09959-1535-4dd2-b02c-55cf144f52c2-kube-api-access-vh8q6\") pod \"dnsmasq-dns-6d5b6d6b67-dzhx8\" (UID: \"99b09959-1535-4dd2-b02c-55cf144f52c2\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.585833 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-dzhx8\" (UID: \"99b09959-1535-4dd2-b02c-55cf144f52c2\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.585930 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-config\") pod \"dnsmasq-dns-6d5b6d6b67-dzhx8\" (UID: \"99b09959-1535-4dd2-b02c-55cf144f52c2\") " 
pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.585967 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-dzhx8\" (UID: \"99b09959-1535-4dd2-b02c-55cf144f52c2\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.586034 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-dzhx8\" (UID: \"99b09959-1535-4dd2-b02c-55cf144f52c2\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.687662 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-dzhx8\" (UID: \"99b09959-1535-4dd2-b02c-55cf144f52c2\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.687852 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-config\") pod \"dnsmasq-dns-6d5b6d6b67-dzhx8\" (UID: \"99b09959-1535-4dd2-b02c-55cf144f52c2\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.687944 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-dzhx8\" (UID: \"99b09959-1535-4dd2-b02c-55cf144f52c2\") " 
pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.688021 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-dzhx8\" (UID: \"99b09959-1535-4dd2-b02c-55cf144f52c2\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.688086 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-dzhx8\" (UID: \"99b09959-1535-4dd2-b02c-55cf144f52c2\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.688184 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh8q6\" (UniqueName: \"kubernetes.io/projected/99b09959-1535-4dd2-b02c-55cf144f52c2-kube-api-access-vh8q6\") pod \"dnsmasq-dns-6d5b6d6b67-dzhx8\" (UID: \"99b09959-1535-4dd2-b02c-55cf144f52c2\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.689589 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-dzhx8\" (UID: \"99b09959-1535-4dd2-b02c-55cf144f52c2\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.690271 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-config\") pod \"dnsmasq-dns-6d5b6d6b67-dzhx8\" (UID: \"99b09959-1535-4dd2-b02c-55cf144f52c2\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" Sep 29 10:04:52 crc kubenswrapper[4891]: 
I0929 10:04:52.690475 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-dzhx8\" (UID: \"99b09959-1535-4dd2-b02c-55cf144f52c2\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.691328 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-dzhx8\" (UID: \"99b09959-1535-4dd2-b02c-55cf144f52c2\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.691398 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-dzhx8\" (UID: \"99b09959-1535-4dd2-b02c-55cf144f52c2\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.717457 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh8q6\" (UniqueName: \"kubernetes.io/projected/99b09959-1535-4dd2-b02c-55cf144f52c2-kube-api-access-vh8q6\") pod \"dnsmasq-dns-6d5b6d6b67-dzhx8\" (UID: \"99b09959-1535-4dd2-b02c-55cf144f52c2\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" Sep 29 10:04:52 crc kubenswrapper[4891]: I0929 10:04:52.804219 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" Sep 29 10:04:53 crc kubenswrapper[4891]: I0929 10:04:53.338707 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-dzhx8"] Sep 29 10:04:53 crc kubenswrapper[4891]: W0929 10:04:53.347546 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99b09959_1535_4dd2_b02c_55cf144f52c2.slice/crio-97a588b151a44a2bdedd05f522f798d24be60cdac3a95d968d996ade7c007359 WatchSource:0}: Error finding container 97a588b151a44a2bdedd05f522f798d24be60cdac3a95d968d996ade7c007359: Status 404 returned error can't find the container with id 97a588b151a44a2bdedd05f522f798d24be60cdac3a95d968d996ade7c007359 Sep 29 10:04:54 crc kubenswrapper[4891]: I0929 10:04:54.166340 4891 generic.go:334] "Generic (PLEG): container finished" podID="99b09959-1535-4dd2-b02c-55cf144f52c2" containerID="ed79373bb3c7627116c9206b6aed4ad6969a2f197c2a61918f106b38956d2d9f" exitCode=0 Sep 29 10:04:54 crc kubenswrapper[4891]: I0929 10:04:54.166488 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" event={"ID":"99b09959-1535-4dd2-b02c-55cf144f52c2","Type":"ContainerDied","Data":"ed79373bb3c7627116c9206b6aed4ad6969a2f197c2a61918f106b38956d2d9f"} Sep 29 10:04:54 crc kubenswrapper[4891]: I0929 10:04:54.167180 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" event={"ID":"99b09959-1535-4dd2-b02c-55cf144f52c2","Type":"ContainerStarted","Data":"97a588b151a44a2bdedd05f522f798d24be60cdac3a95d968d996ade7c007359"} Sep 29 10:04:55 crc kubenswrapper[4891]: I0929 10:04:55.182386 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" event={"ID":"99b09959-1535-4dd2-b02c-55cf144f52c2","Type":"ContainerStarted","Data":"5b068b1b82c8a36a28e341154de4113cc8f1b91c349b0148239a87acd01b6f97"} Sep 29 10:04:55 crc 
kubenswrapper[4891]: I0929 10:04:55.182877 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8"
Sep 29 10:04:55 crc kubenswrapper[4891]: I0929 10:04:55.207764 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" podStartSLOduration=3.20754689 podStartE2EDuration="3.20754689s" podCreationTimestamp="2025-09-29 10:04:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:04:55.204337585 +0000 UTC m=+1025.409505996" watchObservedRunningTime="2025-09-29 10:04:55.20754689 +0000 UTC m=+1025.412715251"
Sep 29 10:04:57 crc kubenswrapper[4891]: I0929 10:04:57.209758 4891 generic.go:334] "Generic (PLEG): container finished" podID="11c99c95-cf39-4352-b4c0-25c0ac4b6465" containerID="0637e2fb45d8c9d1f0ccc2cb7817d4bd647859ebe52a35b0cbc9eee3a3593351" exitCode=0
Sep 29 10:04:57 crc kubenswrapper[4891]: I0929 10:04:57.209963 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-cp9f2" event={"ID":"11c99c95-cf39-4352-b4c0-25c0ac4b6465","Type":"ContainerDied","Data":"0637e2fb45d8c9d1f0ccc2cb7817d4bd647859ebe52a35b0cbc9eee3a3593351"}
Sep 29 10:04:58 crc kubenswrapper[4891]: I0929 10:04:58.756212 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-cp9f2"
Sep 29 10:04:58 crc kubenswrapper[4891]: I0929 10:04:58.836229 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11c99c95-cf39-4352-b4c0-25c0ac4b6465-combined-ca-bundle\") pod \"11c99c95-cf39-4352-b4c0-25c0ac4b6465\" (UID: \"11c99c95-cf39-4352-b4c0-25c0ac4b6465\") "
Sep 29 10:04:58 crc kubenswrapper[4891]: I0929 10:04:58.836662 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11c99c95-cf39-4352-b4c0-25c0ac4b6465-config-data\") pod \"11c99c95-cf39-4352-b4c0-25c0ac4b6465\" (UID: \"11c99c95-cf39-4352-b4c0-25c0ac4b6465\") "
Sep 29 10:04:58 crc kubenswrapper[4891]: I0929 10:04:58.836690 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6jbw\" (UniqueName: \"kubernetes.io/projected/11c99c95-cf39-4352-b4c0-25c0ac4b6465-kube-api-access-z6jbw\") pod \"11c99c95-cf39-4352-b4c0-25c0ac4b6465\" (UID: \"11c99c95-cf39-4352-b4c0-25c0ac4b6465\") "
Sep 29 10:04:58 crc kubenswrapper[4891]: I0929 10:04:58.836738 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/11c99c95-cf39-4352-b4c0-25c0ac4b6465-db-sync-config-data\") pod \"11c99c95-cf39-4352-b4c0-25c0ac4b6465\" (UID: \"11c99c95-cf39-4352-b4c0-25c0ac4b6465\") "
Sep 29 10:04:58 crc kubenswrapper[4891]: I0929 10:04:58.842749 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11c99c95-cf39-4352-b4c0-25c0ac4b6465-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "11c99c95-cf39-4352-b4c0-25c0ac4b6465" (UID: "11c99c95-cf39-4352-b4c0-25c0ac4b6465"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:04:58 crc kubenswrapper[4891]: I0929 10:04:58.842889 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11c99c95-cf39-4352-b4c0-25c0ac4b6465-kube-api-access-z6jbw" (OuterVolumeSpecName: "kube-api-access-z6jbw") pod "11c99c95-cf39-4352-b4c0-25c0ac4b6465" (UID: "11c99c95-cf39-4352-b4c0-25c0ac4b6465"). InnerVolumeSpecName "kube-api-access-z6jbw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:04:58 crc kubenswrapper[4891]: I0929 10:04:58.861183 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11c99c95-cf39-4352-b4c0-25c0ac4b6465-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11c99c95-cf39-4352-b4c0-25c0ac4b6465" (UID: "11c99c95-cf39-4352-b4c0-25c0ac4b6465"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:04:58 crc kubenswrapper[4891]: I0929 10:04:58.896954 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11c99c95-cf39-4352-b4c0-25c0ac4b6465-config-data" (OuterVolumeSpecName: "config-data") pod "11c99c95-cf39-4352-b4c0-25c0ac4b6465" (UID: "11c99c95-cf39-4352-b4c0-25c0ac4b6465"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:04:58 crc kubenswrapper[4891]: I0929 10:04:58.939745 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11c99c95-cf39-4352-b4c0-25c0ac4b6465-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 29 10:04:58 crc kubenswrapper[4891]: I0929 10:04:58.939819 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11c99c95-cf39-4352-b4c0-25c0ac4b6465-config-data\") on node \"crc\" DevicePath \"\""
Sep 29 10:04:58 crc kubenswrapper[4891]: I0929 10:04:58.939832 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6jbw\" (UniqueName: \"kubernetes.io/projected/11c99c95-cf39-4352-b4c0-25c0ac4b6465-kube-api-access-z6jbw\") on node \"crc\" DevicePath \"\""
Sep 29 10:04:58 crc kubenswrapper[4891]: I0929 10:04:58.939856 4891 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/11c99c95-cf39-4352-b4c0-25c0ac4b6465-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Sep 29 10:04:59 crc kubenswrapper[4891]: I0929 10:04:59.256384 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-cp9f2" event={"ID":"11c99c95-cf39-4352-b4c0-25c0ac4b6465","Type":"ContainerDied","Data":"3aaf544afa4f5c9a846718e376b05065a51fc087c11fe32b37866c318501848d"}
Sep 29 10:04:59 crc kubenswrapper[4891]: I0929 10:04:59.256448 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3aaf544afa4f5c9a846718e376b05065a51fc087c11fe32b37866c318501848d"
Sep 29 10:04:59 crc kubenswrapper[4891]: I0929 10:04:59.256825 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-cp9f2"
Sep 29 10:04:59 crc kubenswrapper[4891]: I0929 10:04:59.325128 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Sep 29 10:04:59 crc kubenswrapper[4891]: I0929 10:04:59.717553 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-2bp6p"]
Sep 29 10:04:59 crc kubenswrapper[4891]: E0929 10:04:59.725369 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11c99c95-cf39-4352-b4c0-25c0ac4b6465" containerName="glance-db-sync"
Sep 29 10:04:59 crc kubenswrapper[4891]: I0929 10:04:59.725392 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c99c95-cf39-4352-b4c0-25c0ac4b6465" containerName="glance-db-sync"
Sep 29 10:04:59 crc kubenswrapper[4891]: I0929 10:04:59.725584 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="11c99c95-cf39-4352-b4c0-25c0ac4b6465" containerName="glance-db-sync"
Sep 29 10:04:59 crc kubenswrapper[4891]: I0929 10:04:59.726393 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-2bp6p"
Sep 29 10:04:59 crc kubenswrapper[4891]: I0929 10:04:59.742118 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-2bp6p"]
Sep 29 10:04:59 crc kubenswrapper[4891]: I0929 10:04:59.835097 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-58p75"]
Sep 29 10:04:59 crc kubenswrapper[4891]: I0929 10:04:59.836285 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-58p75"
Sep 29 10:04:59 crc kubenswrapper[4891]: I0929 10:04:59.857398 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd4bh\" (UniqueName: \"kubernetes.io/projected/9b62e005-e6a8-4385-9795-54b88491fab1-kube-api-access-pd4bh\") pod \"cinder-db-create-2bp6p\" (UID: \"9b62e005-e6a8-4385-9795-54b88491fab1\") " pod="openstack/cinder-db-create-2bp6p"
Sep 29 10:04:59 crc kubenswrapper[4891]: I0929 10:04:59.872804 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-58p75"]
Sep 29 10:04:59 crc kubenswrapper[4891]: I0929 10:04:59.889270 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-dzhx8"]
Sep 29 10:04:59 crc kubenswrapper[4891]: I0929 10:04:59.889531 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" podUID="99b09959-1535-4dd2-b02c-55cf144f52c2" containerName="dnsmasq-dns" containerID="cri-o://5b068b1b82c8a36a28e341154de4113cc8f1b91c349b0148239a87acd01b6f97" gracePeriod=10
Sep 29 10:04:59 crc kubenswrapper[4891]: I0929 10:04:59.892623 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8"
Sep 29 10:04:59 crc kubenswrapper[4891]: I0929 10:04:59.959220 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf4dp\" (UniqueName: \"kubernetes.io/projected/470b84f7-4a21-43c5-8770-f252f3e9bf6c-kube-api-access-hf4dp\") pod \"barbican-db-create-58p75\" (UID: \"470b84f7-4a21-43c5-8770-f252f3e9bf6c\") " pod="openstack/barbican-db-create-58p75"
Sep 29 10:04:59 crc kubenswrapper[4891]: I0929 10:04:59.959654 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd4bh\" (UniqueName: \"kubernetes.io/projected/9b62e005-e6a8-4385-9795-54b88491fab1-kube-api-access-pd4bh\") pod \"cinder-db-create-2bp6p\" (UID: \"9b62e005-e6a8-4385-9795-54b88491fab1\") " pod="openstack/cinder-db-create-2bp6p"
Sep 29 10:04:59 crc kubenswrapper[4891]: I0929 10:04:59.965873 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-r7nq7"]
Sep 29 10:04:59 crc kubenswrapper[4891]: I0929 10:04:59.967444 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-r7nq7"
Sep 29 10:04:59 crc kubenswrapper[4891]: I0929 10:04:59.985641 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-r7nq7"]
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.021582 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd4bh\" (UniqueName: \"kubernetes.io/projected/9b62e005-e6a8-4385-9795-54b88491fab1-kube-api-access-pd4bh\") pod \"cinder-db-create-2bp6p\" (UID: \"9b62e005-e6a8-4385-9795-54b88491fab1\") " pod="openstack/cinder-db-create-2bp6p"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.071473 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-2bp6p"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.074560 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-r7nq7\" (UID: \"a59c7eb8-6677-4725-8a96-6920e1b84c83\") " pod="openstack/dnsmasq-dns-895cf5cf-r7nq7"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.074671 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz8v5\" (UniqueName: \"kubernetes.io/projected/a59c7eb8-6677-4725-8a96-6920e1b84c83-kube-api-access-sz8v5\") pod \"dnsmasq-dns-895cf5cf-r7nq7\" (UID: \"a59c7eb8-6677-4725-8a96-6920e1b84c83\") " pod="openstack/dnsmasq-dns-895cf5cf-r7nq7"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.074723 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf4dp\" (UniqueName: \"kubernetes.io/projected/470b84f7-4a21-43c5-8770-f252f3e9bf6c-kube-api-access-hf4dp\") pod \"barbican-db-create-58p75\" (UID: \"470b84f7-4a21-43c5-8770-f252f3e9bf6c\") " pod="openstack/barbican-db-create-58p75"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.074852 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-dns-svc\") pod \"dnsmasq-dns-895cf5cf-r7nq7\" (UID: \"a59c7eb8-6677-4725-8a96-6920e1b84c83\") " pod="openstack/dnsmasq-dns-895cf5cf-r7nq7"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.074892 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-r7nq7\" (UID: \"a59c7eb8-6677-4725-8a96-6920e1b84c83\") " pod="openstack/dnsmasq-dns-895cf5cf-r7nq7"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.074919 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-r7nq7\" (UID: \"a59c7eb8-6677-4725-8a96-6920e1b84c83\") " pod="openstack/dnsmasq-dns-895cf5cf-r7nq7"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.074945 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-config\") pod \"dnsmasq-dns-895cf5cf-r7nq7\" (UID: \"a59c7eb8-6677-4725-8a96-6920e1b84c83\") " pod="openstack/dnsmasq-dns-895cf5cf-r7nq7"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.140641 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf4dp\" (UniqueName: \"kubernetes.io/projected/470b84f7-4a21-43c5-8770-f252f3e9bf6c-kube-api-access-hf4dp\") pod \"barbican-db-create-58p75\" (UID: \"470b84f7-4a21-43c5-8770-f252f3e9bf6c\") " pod="openstack/barbican-db-create-58p75"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.142863 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-9dsgv"]
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.156006 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-58p75"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.157346 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-9dsgv"]
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.157477 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9dsgv"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.177629 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-dns-svc\") pod \"dnsmasq-dns-895cf5cf-r7nq7\" (UID: \"a59c7eb8-6677-4725-8a96-6920e1b84c83\") " pod="openstack/dnsmasq-dns-895cf5cf-r7nq7"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.177684 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-r7nq7\" (UID: \"a59c7eb8-6677-4725-8a96-6920e1b84c83\") " pod="openstack/dnsmasq-dns-895cf5cf-r7nq7"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.177708 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-r7nq7\" (UID: \"a59c7eb8-6677-4725-8a96-6920e1b84c83\") " pod="openstack/dnsmasq-dns-895cf5cf-r7nq7"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.177726 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-config\") pod \"dnsmasq-dns-895cf5cf-r7nq7\" (UID: \"a59c7eb8-6677-4725-8a96-6920e1b84c83\") " pod="openstack/dnsmasq-dns-895cf5cf-r7nq7"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.177765 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-r7nq7\" (UID: \"a59c7eb8-6677-4725-8a96-6920e1b84c83\") " pod="openstack/dnsmasq-dns-895cf5cf-r7nq7"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.177828 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz8v5\" (UniqueName: \"kubernetes.io/projected/a59c7eb8-6677-4725-8a96-6920e1b84c83-kube-api-access-sz8v5\") pod \"dnsmasq-dns-895cf5cf-r7nq7\" (UID: \"a59c7eb8-6677-4725-8a96-6920e1b84c83\") " pod="openstack/dnsmasq-dns-895cf5cf-r7nq7"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.177876 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4dbb\" (UniqueName: \"kubernetes.io/projected/97300203-1ba7-435f-9567-db7ebe1f6234-kube-api-access-d4dbb\") pod \"neutron-db-create-9dsgv\" (UID: \"97300203-1ba7-435f-9567-db7ebe1f6234\") " pod="openstack/neutron-db-create-9dsgv"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.179916 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-config\") pod \"dnsmasq-dns-895cf5cf-r7nq7\" (UID: \"a59c7eb8-6677-4725-8a96-6920e1b84c83\") " pod="openstack/dnsmasq-dns-895cf5cf-r7nq7"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.180553 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-r7nq7\" (UID: \"a59c7eb8-6677-4725-8a96-6920e1b84c83\") " pod="openstack/dnsmasq-dns-895cf5cf-r7nq7"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.180972 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-r7nq7\" (UID: \"a59c7eb8-6677-4725-8a96-6920e1b84c83\") " pod="openstack/dnsmasq-dns-895cf5cf-r7nq7"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.181047 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-dns-svc\") pod \"dnsmasq-dns-895cf5cf-r7nq7\" (UID: \"a59c7eb8-6677-4725-8a96-6920e1b84c83\") " pod="openstack/dnsmasq-dns-895cf5cf-r7nq7"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.181155 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-r7nq7\" (UID: \"a59c7eb8-6677-4725-8a96-6920e1b84c83\") " pod="openstack/dnsmasq-dns-895cf5cf-r7nq7"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.218091 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz8v5\" (UniqueName: \"kubernetes.io/projected/a59c7eb8-6677-4725-8a96-6920e1b84c83-kube-api-access-sz8v5\") pod \"dnsmasq-dns-895cf5cf-r7nq7\" (UID: \"a59c7eb8-6677-4725-8a96-6920e1b84c83\") " pod="openstack/dnsmasq-dns-895cf5cf-r7nq7"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.237597 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-2bbjd"]
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.243771 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2bbjd"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.251655 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-trmdq"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.251933 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.252043 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.252246 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.256677 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-r7nq7"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.258589 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2bbjd"]
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.284114 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4dbb\" (UniqueName: \"kubernetes.io/projected/97300203-1ba7-435f-9567-db7ebe1f6234-kube-api-access-d4dbb\") pod \"neutron-db-create-9dsgv\" (UID: \"97300203-1ba7-435f-9567-db7ebe1f6234\") " pod="openstack/neutron-db-create-9dsgv"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.312813 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4dbb\" (UniqueName: \"kubernetes.io/projected/97300203-1ba7-435f-9567-db7ebe1f6234-kube-api-access-d4dbb\") pod \"neutron-db-create-9dsgv\" (UID: \"97300203-1ba7-435f-9567-db7ebe1f6234\") " pod="openstack/neutron-db-create-9dsgv"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.315282 4891 generic.go:334] "Generic (PLEG): container finished" podID="99b09959-1535-4dd2-b02c-55cf144f52c2" containerID="5b068b1b82c8a36a28e341154de4113cc8f1b91c349b0148239a87acd01b6f97" exitCode=0
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.315318 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" event={"ID":"99b09959-1535-4dd2-b02c-55cf144f52c2","Type":"ContainerDied","Data":"5b068b1b82c8a36a28e341154de4113cc8f1b91c349b0148239a87acd01b6f97"}
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.386640 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg27r\" (UniqueName: \"kubernetes.io/projected/7e83cc3b-a436-4846-9246-1e1dec8e85cc-kube-api-access-zg27r\") pod \"keystone-db-sync-2bbjd\" (UID: \"7e83cc3b-a436-4846-9246-1e1dec8e85cc\") " pod="openstack/keystone-db-sync-2bbjd"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.387142 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e83cc3b-a436-4846-9246-1e1dec8e85cc-combined-ca-bundle\") pod \"keystone-db-sync-2bbjd\" (UID: \"7e83cc3b-a436-4846-9246-1e1dec8e85cc\") " pod="openstack/keystone-db-sync-2bbjd"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.387177 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e83cc3b-a436-4846-9246-1e1dec8e85cc-config-data\") pod \"keystone-db-sync-2bbjd\" (UID: \"7e83cc3b-a436-4846-9246-1e1dec8e85cc\") " pod="openstack/keystone-db-sync-2bbjd"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.490025 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e83cc3b-a436-4846-9246-1e1dec8e85cc-combined-ca-bundle\") pod \"keystone-db-sync-2bbjd\" (UID: \"7e83cc3b-a436-4846-9246-1e1dec8e85cc\") " pod="openstack/keystone-db-sync-2bbjd"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.490092 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e83cc3b-a436-4846-9246-1e1dec8e85cc-config-data\") pod \"keystone-db-sync-2bbjd\" (UID: \"7e83cc3b-a436-4846-9246-1e1dec8e85cc\") " pod="openstack/keystone-db-sync-2bbjd"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.490158 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg27r\" (UniqueName: \"kubernetes.io/projected/7e83cc3b-a436-4846-9246-1e1dec8e85cc-kube-api-access-zg27r\") pod \"keystone-db-sync-2bbjd\" (UID: \"7e83cc3b-a436-4846-9246-1e1dec8e85cc\") " pod="openstack/keystone-db-sync-2bbjd"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.497202 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e83cc3b-a436-4846-9246-1e1dec8e85cc-config-data\") pod \"keystone-db-sync-2bbjd\" (UID: \"7e83cc3b-a436-4846-9246-1e1dec8e85cc\") " pod="openstack/keystone-db-sync-2bbjd"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.497214 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e83cc3b-a436-4846-9246-1e1dec8e85cc-combined-ca-bundle\") pod \"keystone-db-sync-2bbjd\" (UID: \"7e83cc3b-a436-4846-9246-1e1dec8e85cc\") " pod="openstack/keystone-db-sync-2bbjd"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.518530 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg27r\" (UniqueName: \"kubernetes.io/projected/7e83cc3b-a436-4846-9246-1e1dec8e85cc-kube-api-access-zg27r\") pod \"keystone-db-sync-2bbjd\" (UID: \"7e83cc3b-a436-4846-9246-1e1dec8e85cc\") " pod="openstack/keystone-db-sync-2bbjd"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.544501 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.567837 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9dsgv"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.590420 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2bbjd"
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.693619 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh8q6\" (UniqueName: \"kubernetes.io/projected/99b09959-1535-4dd2-b02c-55cf144f52c2-kube-api-access-vh8q6\") pod \"99b09959-1535-4dd2-b02c-55cf144f52c2\" (UID: \"99b09959-1535-4dd2-b02c-55cf144f52c2\") "
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.693702 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-ovsdbserver-nb\") pod \"99b09959-1535-4dd2-b02c-55cf144f52c2\" (UID: \"99b09959-1535-4dd2-b02c-55cf144f52c2\") "
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.693894 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-config\") pod \"99b09959-1535-4dd2-b02c-55cf144f52c2\" (UID: \"99b09959-1535-4dd2-b02c-55cf144f52c2\") "
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.693956 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-dns-svc\") pod \"99b09959-1535-4dd2-b02c-55cf144f52c2\" (UID: \"99b09959-1535-4dd2-b02c-55cf144f52c2\") "
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.694005 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-ovsdbserver-sb\") pod \"99b09959-1535-4dd2-b02c-55cf144f52c2\" (UID: \"99b09959-1535-4dd2-b02c-55cf144f52c2\") "
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.694121 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-dns-swift-storage-0\") pod \"99b09959-1535-4dd2-b02c-55cf144f52c2\" (UID: \"99b09959-1535-4dd2-b02c-55cf144f52c2\") "
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.710281 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99b09959-1535-4dd2-b02c-55cf144f52c2-kube-api-access-vh8q6" (OuterVolumeSpecName: "kube-api-access-vh8q6") pod "99b09959-1535-4dd2-b02c-55cf144f52c2" (UID: "99b09959-1535-4dd2-b02c-55cf144f52c2"). InnerVolumeSpecName "kube-api-access-vh8q6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.768133 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "99b09959-1535-4dd2-b02c-55cf144f52c2" (UID: "99b09959-1535-4dd2-b02c-55cf144f52c2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.790623 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "99b09959-1535-4dd2-b02c-55cf144f52c2" (UID: "99b09959-1535-4dd2-b02c-55cf144f52c2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.797621 4891 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-dns-svc\") on node \"crc\" DevicePath \"\""
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.797667 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.797681 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh8q6\" (UniqueName: \"kubernetes.io/projected/99b09959-1535-4dd2-b02c-55cf144f52c2-kube-api-access-vh8q6\") on node \"crc\" DevicePath \"\""
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.806438 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-config" (OuterVolumeSpecName: "config") pod "99b09959-1535-4dd2-b02c-55cf144f52c2" (UID: "99b09959-1535-4dd2-b02c-55cf144f52c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.845531 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "99b09959-1535-4dd2-b02c-55cf144f52c2" (UID: "99b09959-1535-4dd2-b02c-55cf144f52c2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.846768 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "99b09959-1535-4dd2-b02c-55cf144f52c2" (UID: "99b09959-1535-4dd2-b02c-55cf144f52c2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.901777 4891 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.901822 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.901833 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99b09959-1535-4dd2-b02c-55cf144f52c2-config\") on node \"crc\" DevicePath \"\""
Sep 29 10:05:00 crc kubenswrapper[4891]: I0929 10:05:00.908236 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-2bp6p"]
Sep 29 10:05:01 crc kubenswrapper[4891]: I0929 10:05:01.001824 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-58p75"]
Sep 29 10:05:01 crc kubenswrapper[4891]: I0929 10:05:01.137543 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-r7nq7"]
Sep 29 10:05:01 crc kubenswrapper[4891]: I0929 10:05:01.328922 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-r7nq7" event={"ID":"a59c7eb8-6677-4725-8a96-6920e1b84c83","Type":"ContainerStarted","Data":"e013818abd86e368764516151f9b80e0205a5be5a583f43b644b6c846988fcc8"}
Sep 29 10:05:01 crc kubenswrapper[4891]: I0929 10:05:01.337770 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2bp6p" event={"ID":"9b62e005-e6a8-4385-9795-54b88491fab1","Type":"ContainerStarted","Data":"b0a01c40319efddea2052f2f8f2a4f20ce12077e5194ffc6591db973087c5a87"}
Sep 29 10:05:01 crc kubenswrapper[4891]: I0929 10:05:01.343004 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8" event={"ID":"99b09959-1535-4dd2-b02c-55cf144f52c2","Type":"ContainerDied","Data":"97a588b151a44a2bdedd05f522f798d24be60cdac3a95d968d996ade7c007359"}
Sep 29 10:05:01 crc kubenswrapper[4891]: I0929 10:05:01.343163 4891 scope.go:117] "RemoveContainer" containerID="5b068b1b82c8a36a28e341154de4113cc8f1b91c349b0148239a87acd01b6f97"
Sep 29 10:05:01 crc kubenswrapper[4891]: I0929 10:05:01.343336 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-dzhx8"
Sep 29 10:05:01 crc kubenswrapper[4891]: I0929 10:05:01.347484 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-58p75" event={"ID":"470b84f7-4a21-43c5-8770-f252f3e9bf6c","Type":"ContainerStarted","Data":"ae390f5abaccfee4ee4d3d0f0c3a5d8ee22124a93a7340f2ac4a0f507cf2bd28"}
Sep 29 10:05:01 crc kubenswrapper[4891]: I0929 10:05:01.390182 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2bbjd"]
Sep 29 10:05:01 crc kubenswrapper[4891]: I0929 10:05:01.400834 4891 scope.go:117] "RemoveContainer" containerID="ed79373bb3c7627116c9206b6aed4ad6969a2f197c2a61918f106b38956d2d9f"
Sep 29 10:05:01 crc kubenswrapper[4891]: I0929 10:05:01.403525 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-9dsgv"]
Sep 29 10:05:01 crc kubenswrapper[4891]: I0929 10:05:01.413448 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-dzhx8"]
Sep 29 10:05:01 crc kubenswrapper[4891]: I0929 10:05:01.421211 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-dzhx8"]
Sep 29 10:05:02 crc kubenswrapper[4891]: I0929 10:05:02.358667 4891 generic.go:334] "Generic (PLEG): container finished" podID="470b84f7-4a21-43c5-8770-f252f3e9bf6c" containerID="8b51737e23f5d950b90347cf88885c6bf6d7bb7ecbf315ca425c32e06843c4dd" exitCode=0
Sep 29 10:05:02 crc kubenswrapper[4891]: I0929 10:05:02.358735 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-58p75" event={"ID":"470b84f7-4a21-43c5-8770-f252f3e9bf6c","Type":"ContainerDied","Data":"8b51737e23f5d950b90347cf88885c6bf6d7bb7ecbf315ca425c32e06843c4dd"}
Sep 29 10:05:02 crc kubenswrapper[4891]: I0929 10:05:02.361384 4891 generic.go:334] "Generic (PLEG): container finished" podID="a59c7eb8-6677-4725-8a96-6920e1b84c83" containerID="3b1fd89b22e9b20a14712190d1a12ba9f8a562012c4c0e299f2df9f9e13a268b" exitCode=0
Sep 29 10:05:02 crc kubenswrapper[4891]: I0929 10:05:02.361446 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-r7nq7" event={"ID":"a59c7eb8-6677-4725-8a96-6920e1b84c83","Type":"ContainerDied","Data":"3b1fd89b22e9b20a14712190d1a12ba9f8a562012c4c0e299f2df9f9e13a268b"}
Sep 29 10:05:02 crc kubenswrapper[4891]: I0929 10:05:02.362652 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2bbjd" event={"ID":"7e83cc3b-a436-4846-9246-1e1dec8e85cc","Type":"ContainerStarted","Data":"d7624ace97852f907edec6d21489259f92c724e4fd4f4bef94bd1b92cc44940e"}
Sep 29 10:05:02 crc kubenswrapper[4891]: I0929 10:05:02.365950 4891 generic.go:334] "Generic (PLEG): container finished" podID="97300203-1ba7-435f-9567-db7ebe1f6234" containerID="090bddd3e5abebc806e3514e75503344a5b073460904eb800faf5ff00037864f" exitCode=0
Sep 29 10:05:02 crc kubenswrapper[4891]: I0929 10:05:02.366012 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9dsgv" event={"ID":"97300203-1ba7-435f-9567-db7ebe1f6234","Type":"ContainerDied","Data":"090bddd3e5abebc806e3514e75503344a5b073460904eb800faf5ff00037864f"}
Sep 29 10:05:02 crc kubenswrapper[4891]: I0929 10:05:02.366081 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9dsgv" event={"ID":"97300203-1ba7-435f-9567-db7ebe1f6234","Type":"ContainerStarted","Data":"a9f0978e2f654e3d0d109d81fe1ad6e69cfd6f10d8eb208aba6d4ec294b809c5"}
Sep 29 10:05:02 crc kubenswrapper[4891]: I0929 10:05:02.368416 4891 generic.go:334] "Generic (PLEG): container finished" podID="9b62e005-e6a8-4385-9795-54b88491fab1" containerID="c81f78241212772aa95f0f2d22a048f7b7f9bd391b893af6e6ff9b7f7c2cb6fa" exitCode=0
Sep 29 10:05:02 crc kubenswrapper[4891]: I0929 10:05:02.368467 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2bp6p" event={"ID":"9b62e005-e6a8-4385-9795-54b88491fab1","Type":"ContainerDied","Data":"c81f78241212772aa95f0f2d22a048f7b7f9bd391b893af6e6ff9b7f7c2cb6fa"}
Sep 29 10:05:02 crc kubenswrapper[4891]: I0929 10:05:02.421517 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99b09959-1535-4dd2-b02c-55cf144f52c2" path="/var/lib/kubelet/pods/99b09959-1535-4dd2-b02c-55cf144f52c2/volumes"
Sep 29 10:05:03 crc kubenswrapper[4891]: I0929 10:05:03.379065 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-r7nq7" event={"ID":"a59c7eb8-6677-4725-8a96-6920e1b84c83","Type":"ContainerStarted","Data":"ebcfbfb0783dfbcba0948383f93b72cc28f36b28cf9522ee0cba3a18c02b544b"}
Sep 29 10:05:03 crc kubenswrapper[4891]: I0929 10:05:03.379741 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-895cf5cf-r7nq7"
Sep 29 10:05:03 crc kubenswrapper[4891]: I0929 10:05:03.412695 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-895cf5cf-r7nq7" podStartSLOduration=4.41267315 podStartE2EDuration="4.41267315s" podCreationTimestamp="2025-09-29 10:04:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:05:03.403388817 +0000 UTC m=+1033.608557138" watchObservedRunningTime="2025-09-29 10:05:03.41267315 +0000 UTC m=+1033.617841471"
Sep 29 10:05:03 crc kubenswrapper[4891]: I0929 10:05:03.796342 4891 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/barbican-db-create-58p75" Sep 29 10:05:03 crc kubenswrapper[4891]: I0929 10:05:03.873027 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf4dp\" (UniqueName: \"kubernetes.io/projected/470b84f7-4a21-43c5-8770-f252f3e9bf6c-kube-api-access-hf4dp\") pod \"470b84f7-4a21-43c5-8770-f252f3e9bf6c\" (UID: \"470b84f7-4a21-43c5-8770-f252f3e9bf6c\") " Sep 29 10:05:03 crc kubenswrapper[4891]: I0929 10:05:03.881210 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/470b84f7-4a21-43c5-8770-f252f3e9bf6c-kube-api-access-hf4dp" (OuterVolumeSpecName: "kube-api-access-hf4dp") pod "470b84f7-4a21-43c5-8770-f252f3e9bf6c" (UID: "470b84f7-4a21-43c5-8770-f252f3e9bf6c"). InnerVolumeSpecName "kube-api-access-hf4dp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:03 crc kubenswrapper[4891]: I0929 10:05:03.944697 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9dsgv" Sep 29 10:05:03 crc kubenswrapper[4891]: I0929 10:05:03.958492 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-2bp6p" Sep 29 10:05:03 crc kubenswrapper[4891]: I0929 10:05:03.975707 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hf4dp\" (UniqueName: \"kubernetes.io/projected/470b84f7-4a21-43c5-8770-f252f3e9bf6c-kube-api-access-hf4dp\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:04 crc kubenswrapper[4891]: I0929 10:05:04.076643 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4dbb\" (UniqueName: \"kubernetes.io/projected/97300203-1ba7-435f-9567-db7ebe1f6234-kube-api-access-d4dbb\") pod \"97300203-1ba7-435f-9567-db7ebe1f6234\" (UID: \"97300203-1ba7-435f-9567-db7ebe1f6234\") " Sep 29 10:05:04 crc kubenswrapper[4891]: I0929 10:05:04.076822 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd4bh\" (UniqueName: \"kubernetes.io/projected/9b62e005-e6a8-4385-9795-54b88491fab1-kube-api-access-pd4bh\") pod \"9b62e005-e6a8-4385-9795-54b88491fab1\" (UID: \"9b62e005-e6a8-4385-9795-54b88491fab1\") " Sep 29 10:05:04 crc kubenswrapper[4891]: I0929 10:05:04.080556 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b62e005-e6a8-4385-9795-54b88491fab1-kube-api-access-pd4bh" (OuterVolumeSpecName: "kube-api-access-pd4bh") pod "9b62e005-e6a8-4385-9795-54b88491fab1" (UID: "9b62e005-e6a8-4385-9795-54b88491fab1"). InnerVolumeSpecName "kube-api-access-pd4bh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:04 crc kubenswrapper[4891]: I0929 10:05:04.081091 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97300203-1ba7-435f-9567-db7ebe1f6234-kube-api-access-d4dbb" (OuterVolumeSpecName: "kube-api-access-d4dbb") pod "97300203-1ba7-435f-9567-db7ebe1f6234" (UID: "97300203-1ba7-435f-9567-db7ebe1f6234"). InnerVolumeSpecName "kube-api-access-d4dbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:04 crc kubenswrapper[4891]: I0929 10:05:04.179043 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4dbb\" (UniqueName: \"kubernetes.io/projected/97300203-1ba7-435f-9567-db7ebe1f6234-kube-api-access-d4dbb\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:04 crc kubenswrapper[4891]: I0929 10:05:04.179090 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd4bh\" (UniqueName: \"kubernetes.io/projected/9b62e005-e6a8-4385-9795-54b88491fab1-kube-api-access-pd4bh\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:04 crc kubenswrapper[4891]: I0929 10:05:04.392605 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9dsgv" event={"ID":"97300203-1ba7-435f-9567-db7ebe1f6234","Type":"ContainerDied","Data":"a9f0978e2f654e3d0d109d81fe1ad6e69cfd6f10d8eb208aba6d4ec294b809c5"} Sep 29 10:05:04 crc kubenswrapper[4891]: I0929 10:05:04.392658 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9f0978e2f654e3d0d109d81fe1ad6e69cfd6f10d8eb208aba6d4ec294b809c5" Sep 29 10:05:04 crc kubenswrapper[4891]: I0929 10:05:04.392707 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9dsgv" Sep 29 10:05:04 crc kubenswrapper[4891]: I0929 10:05:04.394473 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2bp6p" event={"ID":"9b62e005-e6a8-4385-9795-54b88491fab1","Type":"ContainerDied","Data":"b0a01c40319efddea2052f2f8f2a4f20ce12077e5194ffc6591db973087c5a87"} Sep 29 10:05:04 crc kubenswrapper[4891]: I0929 10:05:04.394498 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0a01c40319efddea2052f2f8f2a4f20ce12077e5194ffc6591db973087c5a87" Sep 29 10:05:04 crc kubenswrapper[4891]: I0929 10:05:04.394523 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-2bp6p" Sep 29 10:05:04 crc kubenswrapper[4891]: I0929 10:05:04.397188 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-58p75" Sep 29 10:05:04 crc kubenswrapper[4891]: I0929 10:05:04.416450 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-58p75" event={"ID":"470b84f7-4a21-43c5-8770-f252f3e9bf6c","Type":"ContainerDied","Data":"ae390f5abaccfee4ee4d3d0f0c3a5d8ee22124a93a7340f2ac4a0f507cf2bd28"} Sep 29 10:05:04 crc kubenswrapper[4891]: I0929 10:05:04.416503 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae390f5abaccfee4ee4d3d0f0c3a5d8ee22124a93a7340f2ac4a0f507cf2bd28" Sep 29 10:05:08 crc kubenswrapper[4891]: I0929 10:05:08.456935 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2bbjd" event={"ID":"7e83cc3b-a436-4846-9246-1e1dec8e85cc","Type":"ContainerStarted","Data":"abf731f434a664062034377e3969182518ca28e110d4516277c631c1140b6297"} Sep 29 10:05:08 crc kubenswrapper[4891]: I0929 10:05:08.499699 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-2bbjd" podStartSLOduration=2.756701339 podStartE2EDuration="8.499670651s" podCreationTimestamp="2025-09-29 10:05:00 +0000 UTC" firstStartedPulling="2025-09-29 10:05:01.400295542 +0000 UTC m=+1031.605463863" lastFinishedPulling="2025-09-29 10:05:07.143264844 +0000 UTC m=+1037.348433175" observedRunningTime="2025-09-29 10:05:08.482980327 +0000 UTC m=+1038.688148688" watchObservedRunningTime="2025-09-29 10:05:08.499670651 +0000 UTC m=+1038.704839012" Sep 29 10:05:09 crc kubenswrapper[4891]: I0929 10:05:09.830177 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-8a72-account-create-wqxzw"] Sep 29 10:05:09 crc kubenswrapper[4891]: E0929 10:05:09.831145 4891 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="97300203-1ba7-435f-9567-db7ebe1f6234" containerName="mariadb-database-create" Sep 29 10:05:09 crc kubenswrapper[4891]: I0929 10:05:09.831166 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="97300203-1ba7-435f-9567-db7ebe1f6234" containerName="mariadb-database-create" Sep 29 10:05:09 crc kubenswrapper[4891]: E0929 10:05:09.831199 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b62e005-e6a8-4385-9795-54b88491fab1" containerName="mariadb-database-create" Sep 29 10:05:09 crc kubenswrapper[4891]: I0929 10:05:09.831207 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b62e005-e6a8-4385-9795-54b88491fab1" containerName="mariadb-database-create" Sep 29 10:05:09 crc kubenswrapper[4891]: E0929 10:05:09.831223 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99b09959-1535-4dd2-b02c-55cf144f52c2" containerName="dnsmasq-dns" Sep 29 10:05:09 crc kubenswrapper[4891]: I0929 10:05:09.831232 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="99b09959-1535-4dd2-b02c-55cf144f52c2" containerName="dnsmasq-dns" Sep 29 10:05:09 crc kubenswrapper[4891]: E0929 10:05:09.831255 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="470b84f7-4a21-43c5-8770-f252f3e9bf6c" containerName="mariadb-database-create" Sep 29 10:05:09 crc kubenswrapper[4891]: I0929 10:05:09.831264 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="470b84f7-4a21-43c5-8770-f252f3e9bf6c" containerName="mariadb-database-create" Sep 29 10:05:09 crc kubenswrapper[4891]: E0929 10:05:09.831287 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99b09959-1535-4dd2-b02c-55cf144f52c2" containerName="init" Sep 29 10:05:09 crc kubenswrapper[4891]: I0929 10:05:09.831295 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="99b09959-1535-4dd2-b02c-55cf144f52c2" containerName="init" Sep 29 10:05:09 crc kubenswrapper[4891]: I0929 10:05:09.831509 4891 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="99b09959-1535-4dd2-b02c-55cf144f52c2" containerName="dnsmasq-dns" Sep 29 10:05:09 crc kubenswrapper[4891]: I0929 10:05:09.831523 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b62e005-e6a8-4385-9795-54b88491fab1" containerName="mariadb-database-create" Sep 29 10:05:09 crc kubenswrapper[4891]: I0929 10:05:09.831542 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="97300203-1ba7-435f-9567-db7ebe1f6234" containerName="mariadb-database-create" Sep 29 10:05:09 crc kubenswrapper[4891]: I0929 10:05:09.831555 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="470b84f7-4a21-43c5-8770-f252f3e9bf6c" containerName="mariadb-database-create" Sep 29 10:05:09 crc kubenswrapper[4891]: I0929 10:05:09.832344 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8a72-account-create-wqxzw" Sep 29 10:05:09 crc kubenswrapper[4891]: I0929 10:05:09.842298 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-8a72-account-create-wqxzw"] Sep 29 10:05:09 crc kubenswrapper[4891]: I0929 10:05:09.883202 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Sep 29 10:05:09 crc kubenswrapper[4891]: I0929 10:05:09.927463 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjsgs\" (UniqueName: \"kubernetes.io/projected/7f582bd0-7e30-4fef-8623-9d0482c4aa7b-kube-api-access-tjsgs\") pod \"barbican-8a72-account-create-wqxzw\" (UID: \"7f582bd0-7e30-4fef-8623-9d0482c4aa7b\") " pod="openstack/barbican-8a72-account-create-wqxzw" Sep 29 10:05:10 crc kubenswrapper[4891]: I0929 10:05:10.008177 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ccba-account-create-cdq25"] Sep 29 10:05:10 crc kubenswrapper[4891]: I0929 10:05:10.009456 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ccba-account-create-cdq25" Sep 29 10:05:10 crc kubenswrapper[4891]: I0929 10:05:10.015569 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Sep 29 10:05:10 crc kubenswrapper[4891]: I0929 10:05:10.036473 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46c9w\" (UniqueName: \"kubernetes.io/projected/9f670344-0a7d-4c50-ad4d-e00195b3f232-kube-api-access-46c9w\") pod \"neutron-ccba-account-create-cdq25\" (UID: \"9f670344-0a7d-4c50-ad4d-e00195b3f232\") " pod="openstack/neutron-ccba-account-create-cdq25" Sep 29 10:05:10 crc kubenswrapper[4891]: I0929 10:05:10.036583 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjsgs\" (UniqueName: \"kubernetes.io/projected/7f582bd0-7e30-4fef-8623-9d0482c4aa7b-kube-api-access-tjsgs\") pod \"barbican-8a72-account-create-wqxzw\" (UID: \"7f582bd0-7e30-4fef-8623-9d0482c4aa7b\") " pod="openstack/barbican-8a72-account-create-wqxzw" Sep 29 10:05:10 crc kubenswrapper[4891]: I0929 10:05:10.038732 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ccba-account-create-cdq25"] Sep 29 10:05:10 crc kubenswrapper[4891]: I0929 10:05:10.062178 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjsgs\" (UniqueName: \"kubernetes.io/projected/7f582bd0-7e30-4fef-8623-9d0482c4aa7b-kube-api-access-tjsgs\") pod \"barbican-8a72-account-create-wqxzw\" (UID: \"7f582bd0-7e30-4fef-8623-9d0482c4aa7b\") " pod="openstack/barbican-8a72-account-create-wqxzw" Sep 29 10:05:10 crc kubenswrapper[4891]: I0929 10:05:10.138134 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46c9w\" (UniqueName: \"kubernetes.io/projected/9f670344-0a7d-4c50-ad4d-e00195b3f232-kube-api-access-46c9w\") pod \"neutron-ccba-account-create-cdq25\" (UID: 
\"9f670344-0a7d-4c50-ad4d-e00195b3f232\") " pod="openstack/neutron-ccba-account-create-cdq25" Sep 29 10:05:10 crc kubenswrapper[4891]: I0929 10:05:10.153972 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46c9w\" (UniqueName: \"kubernetes.io/projected/9f670344-0a7d-4c50-ad4d-e00195b3f232-kube-api-access-46c9w\") pod \"neutron-ccba-account-create-cdq25\" (UID: \"9f670344-0a7d-4c50-ad4d-e00195b3f232\") " pod="openstack/neutron-ccba-account-create-cdq25" Sep 29 10:05:10 crc kubenswrapper[4891]: I0929 10:05:10.195377 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8a72-account-create-wqxzw" Sep 29 10:05:10 crc kubenswrapper[4891]: I0929 10:05:10.259069 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-895cf5cf-r7nq7" Sep 29 10:05:10 crc kubenswrapper[4891]: I0929 10:05:10.344166 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-kb7z9"] Sep 29 10:05:10 crc kubenswrapper[4891]: I0929 10:05:10.345001 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" podUID="e8960beb-b8cf-4c19-9ae8-1a8ce3f52897" containerName="dnsmasq-dns" containerID="cri-o://11d02efd4b91519f57a0828c04b345f86fcb0f8b36b5fdf2e95bfed3c64949d0" gracePeriod=10 Sep 29 10:05:10 crc kubenswrapper[4891]: I0929 10:05:10.348194 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ccba-account-create-cdq25" Sep 29 10:05:11 crc kubenswrapper[4891]: I0929 10:05:10.721172 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-8a72-account-create-wqxzw"] Sep 29 10:05:11 crc kubenswrapper[4891]: I0929 10:05:10.826199 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ccba-account-create-cdq25"] Sep 29 10:05:11 crc kubenswrapper[4891]: I0929 10:05:11.505677 4891 generic.go:334] "Generic (PLEG): container finished" podID="e8960beb-b8cf-4c19-9ae8-1a8ce3f52897" containerID="11d02efd4b91519f57a0828c04b345f86fcb0f8b36b5fdf2e95bfed3c64949d0" exitCode=0 Sep 29 10:05:11 crc kubenswrapper[4891]: I0929 10:05:11.506168 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" event={"ID":"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897","Type":"ContainerDied","Data":"11d02efd4b91519f57a0828c04b345f86fcb0f8b36b5fdf2e95bfed3c64949d0"} Sep 29 10:05:11 crc kubenswrapper[4891]: I0929 10:05:11.509753 4891 generic.go:334] "Generic (PLEG): container finished" podID="9f670344-0a7d-4c50-ad4d-e00195b3f232" containerID="77b59f7de13b33b0804cfbec8ef4aab4efbb2e8a838fa795c94826f8fc46bc5e" exitCode=0 Sep 29 10:05:11 crc kubenswrapper[4891]: I0929 10:05:11.509902 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ccba-account-create-cdq25" event={"ID":"9f670344-0a7d-4c50-ad4d-e00195b3f232","Type":"ContainerDied","Data":"77b59f7de13b33b0804cfbec8ef4aab4efbb2e8a838fa795c94826f8fc46bc5e"} Sep 29 10:05:11 crc kubenswrapper[4891]: I0929 10:05:11.509926 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ccba-account-create-cdq25" event={"ID":"9f670344-0a7d-4c50-ad4d-e00195b3f232","Type":"ContainerStarted","Data":"8d63e1fce488dd69e16c6185712ef38f8d5ff88e3d5c6b24ab7f87c5a2096394"} Sep 29 10:05:11 crc kubenswrapper[4891]: I0929 10:05:11.515617 4891 generic.go:334] "Generic (PLEG): container finished" 
podID="7f582bd0-7e30-4fef-8623-9d0482c4aa7b" containerID="bfb9fcc0ae82a908049fcb06cc95c98d70cc4cf7fcdaf9366bb6d51bdc60cde4" exitCode=0 Sep 29 10:05:11 crc kubenswrapper[4891]: I0929 10:05:11.515670 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8a72-account-create-wqxzw" event={"ID":"7f582bd0-7e30-4fef-8623-9d0482c4aa7b","Type":"ContainerDied","Data":"bfb9fcc0ae82a908049fcb06cc95c98d70cc4cf7fcdaf9366bb6d51bdc60cde4"} Sep 29 10:05:11 crc kubenswrapper[4891]: I0929 10:05:11.515758 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8a72-account-create-wqxzw" event={"ID":"7f582bd0-7e30-4fef-8623-9d0482c4aa7b","Type":"ContainerStarted","Data":"33dd60219716ee4972419c387693fd10c307aa1e9f1d7e91e72db6e34cea1a1e"} Sep 29 10:05:11 crc kubenswrapper[4891]: I0929 10:05:11.786939 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" Sep 29 10:05:11 crc kubenswrapper[4891]: I0929 10:05:11.977421 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-dns-svc\") pod \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\" (UID: \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\") " Sep 29 10:05:11 crc kubenswrapper[4891]: I0929 10:05:11.977818 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-ovsdbserver-nb\") pod \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\" (UID: \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\") " Sep 29 10:05:11 crc kubenswrapper[4891]: I0929 10:05:11.977843 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-config\") pod \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\" (UID: \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\") " Sep 
29 10:05:11 crc kubenswrapper[4891]: I0929 10:05:11.977905 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-ovsdbserver-sb\") pod \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\" (UID: \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\") " Sep 29 10:05:11 crc kubenswrapper[4891]: I0929 10:05:11.977990 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6scf\" (UniqueName: \"kubernetes.io/projected/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-kube-api-access-z6scf\") pod \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\" (UID: \"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897\") " Sep 29 10:05:11 crc kubenswrapper[4891]: I0929 10:05:11.991086 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-kube-api-access-z6scf" (OuterVolumeSpecName: "kube-api-access-z6scf") pod "e8960beb-b8cf-4c19-9ae8-1a8ce3f52897" (UID: "e8960beb-b8cf-4c19-9ae8-1a8ce3f52897"). InnerVolumeSpecName "kube-api-access-z6scf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:12 crc kubenswrapper[4891]: I0929 10:05:12.028135 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e8960beb-b8cf-4c19-9ae8-1a8ce3f52897" (UID: "e8960beb-b8cf-4c19-9ae8-1a8ce3f52897"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:12 crc kubenswrapper[4891]: I0929 10:05:12.029374 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-config" (OuterVolumeSpecName: "config") pod "e8960beb-b8cf-4c19-9ae8-1a8ce3f52897" (UID: "e8960beb-b8cf-4c19-9ae8-1a8ce3f52897"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:12 crc kubenswrapper[4891]: I0929 10:05:12.032369 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e8960beb-b8cf-4c19-9ae8-1a8ce3f52897" (UID: "e8960beb-b8cf-4c19-9ae8-1a8ce3f52897"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:12 crc kubenswrapper[4891]: I0929 10:05:12.036245 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e8960beb-b8cf-4c19-9ae8-1a8ce3f52897" (UID: "e8960beb-b8cf-4c19-9ae8-1a8ce3f52897"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:12 crc kubenswrapper[4891]: I0929 10:05:12.079758 4891 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:12 crc kubenswrapper[4891]: I0929 10:05:12.079807 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:12 crc kubenswrapper[4891]: I0929 10:05:12.079818 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:12 crc kubenswrapper[4891]: I0929 10:05:12.079826 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:12 crc 
kubenswrapper[4891]: I0929 10:05:12.079840 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6scf\" (UniqueName: \"kubernetes.io/projected/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897-kube-api-access-z6scf\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:12 crc kubenswrapper[4891]: I0929 10:05:12.529005 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" event={"ID":"e8960beb-b8cf-4c19-9ae8-1a8ce3f52897","Type":"ContainerDied","Data":"488268ea32d4ca9973cd600839d9a09ca3933f4c08e836b183fd4f63b68c026d"} Sep 29 10:05:12 crc kubenswrapper[4891]: I0929 10:05:12.529054 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-kb7z9" Sep 29 10:05:12 crc kubenswrapper[4891]: I0929 10:05:12.529108 4891 scope.go:117] "RemoveContainer" containerID="11d02efd4b91519f57a0828c04b345f86fcb0f8b36b5fdf2e95bfed3c64949d0" Sep 29 10:05:12 crc kubenswrapper[4891]: I0929 10:05:12.531992 4891 generic.go:334] "Generic (PLEG): container finished" podID="7e83cc3b-a436-4846-9246-1e1dec8e85cc" containerID="abf731f434a664062034377e3969182518ca28e110d4516277c631c1140b6297" exitCode=0 Sep 29 10:05:12 crc kubenswrapper[4891]: I0929 10:05:12.532078 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2bbjd" event={"ID":"7e83cc3b-a436-4846-9246-1e1dec8e85cc","Type":"ContainerDied","Data":"abf731f434a664062034377e3969182518ca28e110d4516277c631c1140b6297"} Sep 29 10:05:12 crc kubenswrapper[4891]: I0929 10:05:12.594267 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-kb7z9"] Sep 29 10:05:12 crc kubenswrapper[4891]: I0929 10:05:12.599281 4891 scope.go:117] "RemoveContainer" containerID="6b5bf3146316cd7c22d46d1eac96abb94b2b092b396e9815cee8daabd97e6b16" Sep 29 10:05:12 crc kubenswrapper[4891]: I0929 10:05:12.599843 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-b8fbc5445-kb7z9"] Sep 29 10:05:12 crc kubenswrapper[4891]: I0929 10:05:12.945062 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ccba-account-create-cdq25" Sep 29 10:05:12 crc kubenswrapper[4891]: I0929 10:05:12.953080 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8a72-account-create-wqxzw" Sep 29 10:05:13 crc kubenswrapper[4891]: I0929 10:05:13.115280 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46c9w\" (UniqueName: \"kubernetes.io/projected/9f670344-0a7d-4c50-ad4d-e00195b3f232-kube-api-access-46c9w\") pod \"9f670344-0a7d-4c50-ad4d-e00195b3f232\" (UID: \"9f670344-0a7d-4c50-ad4d-e00195b3f232\") " Sep 29 10:05:13 crc kubenswrapper[4891]: I0929 10:05:13.115411 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjsgs\" (UniqueName: \"kubernetes.io/projected/7f582bd0-7e30-4fef-8623-9d0482c4aa7b-kube-api-access-tjsgs\") pod \"7f582bd0-7e30-4fef-8623-9d0482c4aa7b\" (UID: \"7f582bd0-7e30-4fef-8623-9d0482c4aa7b\") " Sep 29 10:05:13 crc kubenswrapper[4891]: I0929 10:05:13.121439 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f670344-0a7d-4c50-ad4d-e00195b3f232-kube-api-access-46c9w" (OuterVolumeSpecName: "kube-api-access-46c9w") pod "9f670344-0a7d-4c50-ad4d-e00195b3f232" (UID: "9f670344-0a7d-4c50-ad4d-e00195b3f232"). InnerVolumeSpecName "kube-api-access-46c9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:13 crc kubenswrapper[4891]: I0929 10:05:13.122900 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f582bd0-7e30-4fef-8623-9d0482c4aa7b-kube-api-access-tjsgs" (OuterVolumeSpecName: "kube-api-access-tjsgs") pod "7f582bd0-7e30-4fef-8623-9d0482c4aa7b" (UID: "7f582bd0-7e30-4fef-8623-9d0482c4aa7b"). 
InnerVolumeSpecName "kube-api-access-tjsgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:13 crc kubenswrapper[4891]: I0929 10:05:13.217337 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46c9w\" (UniqueName: \"kubernetes.io/projected/9f670344-0a7d-4c50-ad4d-e00195b3f232-kube-api-access-46c9w\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:13 crc kubenswrapper[4891]: I0929 10:05:13.217388 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjsgs\" (UniqueName: \"kubernetes.io/projected/7f582bd0-7e30-4fef-8623-9d0482c4aa7b-kube-api-access-tjsgs\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:13 crc kubenswrapper[4891]: I0929 10:05:13.549984 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ccba-account-create-cdq25" Sep 29 10:05:13 crc kubenswrapper[4891]: I0929 10:05:13.550068 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ccba-account-create-cdq25" event={"ID":"9f670344-0a7d-4c50-ad4d-e00195b3f232","Type":"ContainerDied","Data":"8d63e1fce488dd69e16c6185712ef38f8d5ff88e3d5c6b24ab7f87c5a2096394"} Sep 29 10:05:13 crc kubenswrapper[4891]: I0929 10:05:13.550540 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d63e1fce488dd69e16c6185712ef38f8d5ff88e3d5c6b24ab7f87c5a2096394" Sep 29 10:05:13 crc kubenswrapper[4891]: I0929 10:05:13.552418 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8a72-account-create-wqxzw" event={"ID":"7f582bd0-7e30-4fef-8623-9d0482c4aa7b","Type":"ContainerDied","Data":"33dd60219716ee4972419c387693fd10c307aa1e9f1d7e91e72db6e34cea1a1e"} Sep 29 10:05:13 crc kubenswrapper[4891]: I0929 10:05:13.552485 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33dd60219716ee4972419c387693fd10c307aa1e9f1d7e91e72db6e34cea1a1e" Sep 29 10:05:13 crc kubenswrapper[4891]: I0929 10:05:13.552452 4891 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8a72-account-create-wqxzw" Sep 29 10:05:13 crc kubenswrapper[4891]: I0929 10:05:13.905345 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2bbjd" Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.032616 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg27r\" (UniqueName: \"kubernetes.io/projected/7e83cc3b-a436-4846-9246-1e1dec8e85cc-kube-api-access-zg27r\") pod \"7e83cc3b-a436-4846-9246-1e1dec8e85cc\" (UID: \"7e83cc3b-a436-4846-9246-1e1dec8e85cc\") " Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.032885 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e83cc3b-a436-4846-9246-1e1dec8e85cc-config-data\") pod \"7e83cc3b-a436-4846-9246-1e1dec8e85cc\" (UID: \"7e83cc3b-a436-4846-9246-1e1dec8e85cc\") " Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.032957 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e83cc3b-a436-4846-9246-1e1dec8e85cc-combined-ca-bundle\") pod \"7e83cc3b-a436-4846-9246-1e1dec8e85cc\" (UID: \"7e83cc3b-a436-4846-9246-1e1dec8e85cc\") " Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.039433 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e83cc3b-a436-4846-9246-1e1dec8e85cc-kube-api-access-zg27r" (OuterVolumeSpecName: "kube-api-access-zg27r") pod "7e83cc3b-a436-4846-9246-1e1dec8e85cc" (UID: "7e83cc3b-a436-4846-9246-1e1dec8e85cc"). InnerVolumeSpecName "kube-api-access-zg27r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.060160 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e83cc3b-a436-4846-9246-1e1dec8e85cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e83cc3b-a436-4846-9246-1e1dec8e85cc" (UID: "7e83cc3b-a436-4846-9246-1e1dec8e85cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.102229 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e83cc3b-a436-4846-9246-1e1dec8e85cc-config-data" (OuterVolumeSpecName: "config-data") pod "7e83cc3b-a436-4846-9246-1e1dec8e85cc" (UID: "7e83cc3b-a436-4846-9246-1e1dec8e85cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.135420 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e83cc3b-a436-4846-9246-1e1dec8e85cc-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.135590 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e83cc3b-a436-4846-9246-1e1dec8e85cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.135699 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zg27r\" (UniqueName: \"kubernetes.io/projected/7e83cc3b-a436-4846-9246-1e1dec8e85cc-kube-api-access-zg27r\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.411303 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8960beb-b8cf-4c19-9ae8-1a8ce3f52897" path="/var/lib/kubelet/pods/e8960beb-b8cf-4c19-9ae8-1a8ce3f52897/volumes" Sep 
29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.562717 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2bbjd" event={"ID":"7e83cc3b-a436-4846-9246-1e1dec8e85cc","Type":"ContainerDied","Data":"d7624ace97852f907edec6d21489259f92c724e4fd4f4bef94bd1b92cc44940e"} Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.562769 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7624ace97852f907edec6d21489259f92c724e4fd4f4bef94bd1b92cc44940e" Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.562781 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2bbjd" Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.909442 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-cxpf2"] Sep 29 10:05:14 crc kubenswrapper[4891]: E0929 10:05:14.910651 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8960beb-b8cf-4c19-9ae8-1a8ce3f52897" containerName="dnsmasq-dns" Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.910676 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8960beb-b8cf-4c19-9ae8-1a8ce3f52897" containerName="dnsmasq-dns" Sep 29 10:05:14 crc kubenswrapper[4891]: E0929 10:05:14.910701 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f670344-0a7d-4c50-ad4d-e00195b3f232" containerName="mariadb-account-create" Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.910709 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f670344-0a7d-4c50-ad4d-e00195b3f232" containerName="mariadb-account-create" Sep 29 10:05:14 crc kubenswrapper[4891]: E0929 10:05:14.910724 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e83cc3b-a436-4846-9246-1e1dec8e85cc" containerName="keystone-db-sync" Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.910731 4891 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7e83cc3b-a436-4846-9246-1e1dec8e85cc" containerName="keystone-db-sync" Sep 29 10:05:14 crc kubenswrapper[4891]: E0929 10:05:14.910739 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8960beb-b8cf-4c19-9ae8-1a8ce3f52897" containerName="init" Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.910746 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8960beb-b8cf-4c19-9ae8-1a8ce3f52897" containerName="init" Sep 29 10:05:14 crc kubenswrapper[4891]: E0929 10:05:14.910754 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f582bd0-7e30-4fef-8623-9d0482c4aa7b" containerName="mariadb-account-create" Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.910760 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f582bd0-7e30-4fef-8623-9d0482c4aa7b" containerName="mariadb-account-create" Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.910998 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f582bd0-7e30-4fef-8623-9d0482c4aa7b" containerName="mariadb-account-create" Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.911026 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f670344-0a7d-4c50-ad4d-e00195b3f232" containerName="mariadb-account-create" Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.911042 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8960beb-b8cf-4c19-9ae8-1a8ce3f52897" containerName="dnsmasq-dns" Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.911067 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e83cc3b-a436-4846-9246-1e1dec8e85cc" containerName="keystone-db-sync" Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.915594 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.938248 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-cxpf2"] Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.995718 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-cncns"] Sep 29 10:05:14 crc kubenswrapper[4891]: I0929 10:05:14.997144 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cncns" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.005828 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-trmdq" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.006041 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.006163 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.006686 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.015197 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-cncns"] Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.054871 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-config\") pod \"dnsmasq-dns-6c9c9f998c-cxpf2\" (UID: \"3d83f36a-73dd-4a94-9432-7e02ed30a437\") " pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.054944 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-cxpf2\" (UID: \"3d83f36a-73dd-4a94-9432-7e02ed30a437\") " pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.055034 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tt76\" (UniqueName: \"kubernetes.io/projected/3d83f36a-73dd-4a94-9432-7e02ed30a437-kube-api-access-8tt76\") pod \"dnsmasq-dns-6c9c9f998c-cxpf2\" (UID: \"3d83f36a-73dd-4a94-9432-7e02ed30a437\") " pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.055060 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-cxpf2\" (UID: \"3d83f36a-73dd-4a94-9432-7e02ed30a437\") " pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.055084 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-cxpf2\" (UID: \"3d83f36a-73dd-4a94-9432-7e02ed30a437\") " pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.055133 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-cxpf2\" (UID: \"3d83f36a-73dd-4a94-9432-7e02ed30a437\") " pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.134202 4891 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/horizon-c9454dcd5-bk2r4"] Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.137625 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c9454dcd5-bk2r4" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.153701 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.153973 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.154155 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-gm4nl" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.154329 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.161246 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f13e18bf-c8d3-4e58-ada2-b3014689271e-config-data\") pod \"horizon-c9454dcd5-bk2r4\" (UID: \"f13e18bf-c8d3-4e58-ada2-b3014689271e\") " pod="openstack/horizon-c9454dcd5-bk2r4" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.161312 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tt76\" (UniqueName: \"kubernetes.io/projected/3d83f36a-73dd-4a94-9432-7e02ed30a437-kube-api-access-8tt76\") pod \"dnsmasq-dns-6c9c9f998c-cxpf2\" (UID: \"3d83f36a-73dd-4a94-9432-7e02ed30a437\") " pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.161333 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v72hf\" (UniqueName: \"kubernetes.io/projected/f13e18bf-c8d3-4e58-ada2-b3014689271e-kube-api-access-v72hf\") pod \"horizon-c9454dcd5-bk2r4\" 
(UID: \"f13e18bf-c8d3-4e58-ada2-b3014689271e\") " pod="openstack/horizon-c9454dcd5-bk2r4" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.161353 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-credential-keys\") pod \"keystone-bootstrap-cncns\" (UID: \"943687e2-eefb-46b9-8595-224dc883d780\") " pod="openstack/keystone-bootstrap-cncns" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.161371 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-cxpf2\" (UID: \"3d83f36a-73dd-4a94-9432-7e02ed30a437\") " pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.161390 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-scripts\") pod \"keystone-bootstrap-cncns\" (UID: \"943687e2-eefb-46b9-8595-224dc883d780\") " pod="openstack/keystone-bootstrap-cncns" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.161406 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-combined-ca-bundle\") pod \"keystone-bootstrap-cncns\" (UID: \"943687e2-eefb-46b9-8595-224dc883d780\") " pod="openstack/keystone-bootstrap-cncns" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.161424 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f13e18bf-c8d3-4e58-ada2-b3014689271e-scripts\") pod \"horizon-c9454dcd5-bk2r4\" (UID: \"f13e18bf-c8d3-4e58-ada2-b3014689271e\") " 
pod="openstack/horizon-c9454dcd5-bk2r4" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.161440 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f13e18bf-c8d3-4e58-ada2-b3014689271e-logs\") pod \"horizon-c9454dcd5-bk2r4\" (UID: \"f13e18bf-c8d3-4e58-ada2-b3014689271e\") " pod="openstack/horizon-c9454dcd5-bk2r4" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.161457 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-cxpf2\" (UID: \"3d83f36a-73dd-4a94-9432-7e02ed30a437\") " pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.161488 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-fernet-keys\") pod \"keystone-bootstrap-cncns\" (UID: \"943687e2-eefb-46b9-8595-224dc883d780\") " pod="openstack/keystone-bootstrap-cncns" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.161501 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-config-data\") pod \"keystone-bootstrap-cncns\" (UID: \"943687e2-eefb-46b9-8595-224dc883d780\") " pod="openstack/keystone-bootstrap-cncns" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.161543 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f13e18bf-c8d3-4e58-ada2-b3014689271e-horizon-secret-key\") pod \"horizon-c9454dcd5-bk2r4\" (UID: \"f13e18bf-c8d3-4e58-ada2-b3014689271e\") " pod="openstack/horizon-c9454dcd5-bk2r4" Sep 29 
10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.161566 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-cxpf2\" (UID: \"3d83f36a-73dd-4a94-9432-7e02ed30a437\") " pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.161599 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-config\") pod \"dnsmasq-dns-6c9c9f998c-cxpf2\" (UID: \"3d83f36a-73dd-4a94-9432-7e02ed30a437\") " pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.161616 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkb5r\" (UniqueName: \"kubernetes.io/projected/943687e2-eefb-46b9-8595-224dc883d780-kube-api-access-lkb5r\") pod \"keystone-bootstrap-cncns\" (UID: \"943687e2-eefb-46b9-8595-224dc883d780\") " pod="openstack/keystone-bootstrap-cncns" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.161635 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-cxpf2\" (UID: \"3d83f36a-73dd-4a94-9432-7e02ed30a437\") " pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.163770 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-cxpf2\" (UID: \"3d83f36a-73dd-4a94-9432-7e02ed30a437\") " pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" Sep 29 10:05:15 crc kubenswrapper[4891]: 
I0929 10:05:15.164596 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-cxpf2\" (UID: \"3d83f36a-73dd-4a94-9432-7e02ed30a437\") " pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.166008 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-cxpf2\" (UID: \"3d83f36a-73dd-4a94-9432-7e02ed30a437\") " pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.168618 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-config\") pod \"dnsmasq-dns-6c9c9f998c-cxpf2\" (UID: \"3d83f36a-73dd-4a94-9432-7e02ed30a437\") " pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.168759 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-c9454dcd5-bk2r4"] Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.169167 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-cxpf2\" (UID: \"3d83f36a-73dd-4a94-9432-7e02ed30a437\") " pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.240313 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tt76\" (UniqueName: \"kubernetes.io/projected/3d83f36a-73dd-4a94-9432-7e02ed30a437-kube-api-access-8tt76\") pod \"dnsmasq-dns-6c9c9f998c-cxpf2\" (UID: \"3d83f36a-73dd-4a94-9432-7e02ed30a437\") " 
pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.243430 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.270911 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-credential-keys\") pod \"keystone-bootstrap-cncns\" (UID: \"943687e2-eefb-46b9-8595-224dc883d780\") " pod="openstack/keystone-bootstrap-cncns" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.271312 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-scripts\") pod \"keystone-bootstrap-cncns\" (UID: \"943687e2-eefb-46b9-8595-224dc883d780\") " pod="openstack/keystone-bootstrap-cncns" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.271421 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-combined-ca-bundle\") pod \"keystone-bootstrap-cncns\" (UID: \"943687e2-eefb-46b9-8595-224dc883d780\") " pod="openstack/keystone-bootstrap-cncns" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.271510 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f13e18bf-c8d3-4e58-ada2-b3014689271e-scripts\") pod \"horizon-c9454dcd5-bk2r4\" (UID: \"f13e18bf-c8d3-4e58-ada2-b3014689271e\") " pod="openstack/horizon-c9454dcd5-bk2r4" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.271588 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f13e18bf-c8d3-4e58-ada2-b3014689271e-logs\") pod \"horizon-c9454dcd5-bk2r4\" (UID: 
\"f13e18bf-c8d3-4e58-ada2-b3014689271e\") " pod="openstack/horizon-c9454dcd5-bk2r4" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.271746 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-config-data\") pod \"keystone-bootstrap-cncns\" (UID: \"943687e2-eefb-46b9-8595-224dc883d780\") " pod="openstack/keystone-bootstrap-cncns" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.271871 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-fernet-keys\") pod \"keystone-bootstrap-cncns\" (UID: \"943687e2-eefb-46b9-8595-224dc883d780\") " pod="openstack/keystone-bootstrap-cncns" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.272064 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f13e18bf-c8d3-4e58-ada2-b3014689271e-horizon-secret-key\") pod \"horizon-c9454dcd5-bk2r4\" (UID: \"f13e18bf-c8d3-4e58-ada2-b3014689271e\") " pod="openstack/horizon-c9454dcd5-bk2r4" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.272227 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkb5r\" (UniqueName: \"kubernetes.io/projected/943687e2-eefb-46b9-8595-224dc883d780-kube-api-access-lkb5r\") pod \"keystone-bootstrap-cncns\" (UID: \"943687e2-eefb-46b9-8595-224dc883d780\") " pod="openstack/keystone-bootstrap-cncns" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.272433 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f13e18bf-c8d3-4e58-ada2-b3014689271e-config-data\") pod \"horizon-c9454dcd5-bk2r4\" (UID: \"f13e18bf-c8d3-4e58-ada2-b3014689271e\") " pod="openstack/horizon-c9454dcd5-bk2r4" Sep 29 10:05:15 crc 
kubenswrapper[4891]: I0929 10:05:15.272581 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v72hf\" (UniqueName: \"kubernetes.io/projected/f13e18bf-c8d3-4e58-ada2-b3014689271e-kube-api-access-v72hf\") pod \"horizon-c9454dcd5-bk2r4\" (UID: \"f13e18bf-c8d3-4e58-ada2-b3014689271e\") " pod="openstack/horizon-c9454dcd5-bk2r4" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.273234 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f13e18bf-c8d3-4e58-ada2-b3014689271e-logs\") pod \"horizon-c9454dcd5-bk2r4\" (UID: \"f13e18bf-c8d3-4e58-ada2-b3014689271e\") " pod="openstack/horizon-c9454dcd5-bk2r4" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.282556 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f13e18bf-c8d3-4e58-ada2-b3014689271e-scripts\") pod \"horizon-c9454dcd5-bk2r4\" (UID: \"f13e18bf-c8d3-4e58-ada2-b3014689271e\") " pod="openstack/horizon-c9454dcd5-bk2r4" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.294989 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-config-data\") pod \"keystone-bootstrap-cncns\" (UID: \"943687e2-eefb-46b9-8595-224dc883d780\") " pod="openstack/keystone-bootstrap-cncns" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.295662 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f13e18bf-c8d3-4e58-ada2-b3014689271e-horizon-secret-key\") pod \"horizon-c9454dcd5-bk2r4\" (UID: \"f13e18bf-c8d3-4e58-ada2-b3014689271e\") " pod="openstack/horizon-c9454dcd5-bk2r4" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.298844 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/f13e18bf-c8d3-4e58-ada2-b3014689271e-config-data\") pod \"horizon-c9454dcd5-bk2r4\" (UID: \"f13e18bf-c8d3-4e58-ada2-b3014689271e\") " pod="openstack/horizon-c9454dcd5-bk2r4" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.373860 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-fernet-keys\") pod \"keystone-bootstrap-cncns\" (UID: \"943687e2-eefb-46b9-8595-224dc883d780\") " pod="openstack/keystone-bootstrap-cncns" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.390835 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkb5r\" (UniqueName: \"kubernetes.io/projected/943687e2-eefb-46b9-8595-224dc883d780-kube-api-access-lkb5r\") pod \"keystone-bootstrap-cncns\" (UID: \"943687e2-eefb-46b9-8595-224dc883d780\") " pod="openstack/keystone-bootstrap-cncns" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.433663 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v72hf\" (UniqueName: \"kubernetes.io/projected/f13e18bf-c8d3-4e58-ada2-b3014689271e-kube-api-access-v72hf\") pod \"horizon-c9454dcd5-bk2r4\" (UID: \"f13e18bf-c8d3-4e58-ada2-b3014689271e\") " pod="openstack/horizon-c9454dcd5-bk2r4" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.443257 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-scripts\") pod \"keystone-bootstrap-cncns\" (UID: \"943687e2-eefb-46b9-8595-224dc883d780\") " pod="openstack/keystone-bootstrap-cncns" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.443721 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-credential-keys\") pod \"keystone-bootstrap-cncns\" (UID: 
\"943687e2-eefb-46b9-8595-224dc883d780\") " pod="openstack/keystone-bootstrap-cncns" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.449396 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-combined-ca-bundle\") pod \"keystone-bootstrap-cncns\" (UID: \"943687e2-eefb-46b9-8595-224dc883d780\") " pod="openstack/keystone-bootstrap-cncns" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.493624 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c9454dcd5-bk2r4" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.537772 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7cbfd6f48f-lw2wd"] Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.539400 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cbfd6f48f-lw2wd" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.603464 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7cbfd6f48f-lw2wd"] Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.635629 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cncns" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.636509 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-jv6jw"] Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.637911 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jv6jw" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.642926 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.646531 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-bm7bg"] Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.648923 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bm7bg" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.661350 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-kfx6v" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.661602 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.661955 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-g9w5p" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.665221 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.665465 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.665591 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-cxpf2"] Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.680881 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jv6jw"] Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.685588 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z4rk\" (UniqueName: 
\"kubernetes.io/projected/47e58639-672a-4281-b7da-5363647cb329-kube-api-access-6z4rk\") pod \"horizon-7cbfd6f48f-lw2wd\" (UID: \"47e58639-672a-4281-b7da-5363647cb329\") " pod="openstack/horizon-7cbfd6f48f-lw2wd" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.685675 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/47e58639-672a-4281-b7da-5363647cb329-horizon-secret-key\") pod \"horizon-7cbfd6f48f-lw2wd\" (UID: \"47e58639-672a-4281-b7da-5363647cb329\") " pod="openstack/horizon-7cbfd6f48f-lw2wd" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.685718 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47e58639-672a-4281-b7da-5363647cb329-logs\") pod \"horizon-7cbfd6f48f-lw2wd\" (UID: \"47e58639-672a-4281-b7da-5363647cb329\") " pod="openstack/horizon-7cbfd6f48f-lw2wd" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.685745 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/47e58639-672a-4281-b7da-5363647cb329-config-data\") pod \"horizon-7cbfd6f48f-lw2wd\" (UID: \"47e58639-672a-4281-b7da-5363647cb329\") " pod="openstack/horizon-7cbfd6f48f-lw2wd" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.685813 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47e58639-672a-4281-b7da-5363647cb329-scripts\") pod \"horizon-7cbfd6f48f-lw2wd\" (UID: \"47e58639-672a-4281-b7da-5363647cb329\") " pod="openstack/horizon-7cbfd6f48f-lw2wd" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.705892 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-bm7bg"] Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.723193 
4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.728635 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.735269 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.735472 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.739562 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-tkdcz"] Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.741396 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.743709 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-s4bk8"] Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.749884 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-s4bk8" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.758628 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.758662 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qmv66" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.787539 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e641569b-322f-4157-aaf2-44d5f700234d-config\") pod \"neutron-db-sync-bm7bg\" (UID: \"e641569b-322f-4157-aaf2-44d5f700234d\") " pod="openstack/neutron-db-sync-bm7bg" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.787588 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/47e58639-672a-4281-b7da-5363647cb329-horizon-secret-key\") pod \"horizon-7cbfd6f48f-lw2wd\" (UID: \"47e58639-672a-4281-b7da-5363647cb329\") " pod="openstack/horizon-7cbfd6f48f-lw2wd" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.787625 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb88c2dd-0bb3-4425-842f-b697d51f8273-config-data\") pod \"placement-db-sync-jv6jw\" (UID: \"fb88c2dd-0bb3-4425-842f-b697d51f8273\") " pod="openstack/placement-db-sync-jv6jw" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.787642 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47e58639-672a-4281-b7da-5363647cb329-logs\") pod \"horizon-7cbfd6f48f-lw2wd\" (UID: \"47e58639-672a-4281-b7da-5363647cb329\") " pod="openstack/horizon-7cbfd6f48f-lw2wd" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.787665 4891 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/47e58639-672a-4281-b7da-5363647cb329-config-data\") pod \"horizon-7cbfd6f48f-lw2wd\" (UID: \"47e58639-672a-4281-b7da-5363647cb329\") " pod="openstack/horizon-7cbfd6f48f-lw2wd" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.787681 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb88c2dd-0bb3-4425-842f-b697d51f8273-combined-ca-bundle\") pod \"placement-db-sync-jv6jw\" (UID: \"fb88c2dd-0bb3-4425-842f-b697d51f8273\") " pod="openstack/placement-db-sync-jv6jw" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.787702 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb88c2dd-0bb3-4425-842f-b697d51f8273-logs\") pod \"placement-db-sync-jv6jw\" (UID: \"fb88c2dd-0bb3-4425-842f-b697d51f8273\") " pod="openstack/placement-db-sync-jv6jw" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.787744 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfr6b\" (UniqueName: \"kubernetes.io/projected/fb88c2dd-0bb3-4425-842f-b697d51f8273-kube-api-access-cfr6b\") pod \"placement-db-sync-jv6jw\" (UID: \"fb88c2dd-0bb3-4425-842f-b697d51f8273\") " pod="openstack/placement-db-sync-jv6jw" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.787765 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47e58639-672a-4281-b7da-5363647cb329-scripts\") pod \"horizon-7cbfd6f48f-lw2wd\" (UID: \"47e58639-672a-4281-b7da-5363647cb329\") " pod="openstack/horizon-7cbfd6f48f-lw2wd" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.787799 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47wk5\" (UniqueName: \"kubernetes.io/projected/e641569b-322f-4157-aaf2-44d5f700234d-kube-api-access-47wk5\") pod \"neutron-db-sync-bm7bg\" (UID: \"e641569b-322f-4157-aaf2-44d5f700234d\") " pod="openstack/neutron-db-sync-bm7bg" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.787834 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z4rk\" (UniqueName: \"kubernetes.io/projected/47e58639-672a-4281-b7da-5363647cb329-kube-api-access-6z4rk\") pod \"horizon-7cbfd6f48f-lw2wd\" (UID: \"47e58639-672a-4281-b7da-5363647cb329\") " pod="openstack/horizon-7cbfd6f48f-lw2wd" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.787863 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb88c2dd-0bb3-4425-842f-b697d51f8273-scripts\") pod \"placement-db-sync-jv6jw\" (UID: \"fb88c2dd-0bb3-4425-842f-b697d51f8273\") " pod="openstack/placement-db-sync-jv6jw" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.787884 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e641569b-322f-4157-aaf2-44d5f700234d-combined-ca-bundle\") pod \"neutron-db-sync-bm7bg\" (UID: \"e641569b-322f-4157-aaf2-44d5f700234d\") " pod="openstack/neutron-db-sync-bm7bg" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.788968 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47e58639-672a-4281-b7da-5363647cb329-scripts\") pod \"horizon-7cbfd6f48f-lw2wd\" (UID: \"47e58639-672a-4281-b7da-5363647cb329\") " pod="openstack/horizon-7cbfd6f48f-lw2wd" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.792145 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/47e58639-672a-4281-b7da-5363647cb329-config-data\") pod \"horizon-7cbfd6f48f-lw2wd\" (UID: \"47e58639-672a-4281-b7da-5363647cb329\") " pod="openstack/horizon-7cbfd6f48f-lw2wd" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.795520 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/47e58639-672a-4281-b7da-5363647cb329-horizon-secret-key\") pod \"horizon-7cbfd6f48f-lw2wd\" (UID: \"47e58639-672a-4281-b7da-5363647cb329\") " pod="openstack/horizon-7cbfd6f48f-lw2wd" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.796534 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47e58639-672a-4281-b7da-5363647cb329-logs\") pod \"horizon-7cbfd6f48f-lw2wd\" (UID: \"47e58639-672a-4281-b7da-5363647cb329\") " pod="openstack/horizon-7cbfd6f48f-lw2wd" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.808154 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.814548 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z4rk\" (UniqueName: \"kubernetes.io/projected/47e58639-672a-4281-b7da-5363647cb329-kube-api-access-6z4rk\") pod \"horizon-7cbfd6f48f-lw2wd\" (UID: \"47e58639-672a-4281-b7da-5363647cb329\") " pod="openstack/horizon-7cbfd6f48f-lw2wd" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.871162 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-s4bk8"] Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.877751 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-tkdcz"] Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.889918 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-tkdcz\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.890040 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84c6c4b8-5f24-48db-884c-dd0669cb67cc-scripts\") pod \"ceilometer-0\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") " pod="openstack/ceilometer-0" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.890157 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb88c2dd-0bb3-4425-842f-b697d51f8273-scripts\") pod \"placement-db-sync-jv6jw\" (UID: \"fb88c2dd-0bb3-4425-842f-b697d51f8273\") " pod="openstack/placement-db-sync-jv6jw" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.890189 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-config\") pod \"dnsmasq-dns-57c957c4ff-tkdcz\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.890219 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e641569b-322f-4157-aaf2-44d5f700234d-combined-ca-bundle\") pod \"neutron-db-sync-bm7bg\" (UID: \"e641569b-322f-4157-aaf2-44d5f700234d\") " pod="openstack/neutron-db-sync-bm7bg" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.890251 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84c6c4b8-5f24-48db-884c-dd0669cb67cc-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") " pod="openstack/ceilometer-0" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.890285 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e641569b-322f-4157-aaf2-44d5f700234d-config\") pod \"neutron-db-sync-bm7bg\" (UID: \"e641569b-322f-4157-aaf2-44d5f700234d\") " pod="openstack/neutron-db-sync-bm7bg" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.890341 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tkzx\" (UniqueName: \"kubernetes.io/projected/84c6c4b8-5f24-48db-884c-dd0669cb67cc-kube-api-access-5tkzx\") pod \"ceilometer-0\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") " pod="openstack/ceilometer-0" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.890428 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.892096 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.894063 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb88c2dd-0bb3-4425-842f-b697d51f8273-config-data\") pod \"placement-db-sync-jv6jw\" (UID: \"fb88c2dd-0bb3-4425-842f-b697d51f8273\") " pod="openstack/placement-db-sync-jv6jw" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.894117 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84c6c4b8-5f24-48db-884c-dd0669cb67cc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") " pod="openstack/ceilometer-0" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.894142 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4sj8\" (UniqueName: \"kubernetes.io/projected/5872fd60-b8c9-4f00-8c9a-679960a32e27-kube-api-access-q4sj8\") pod \"barbican-db-sync-s4bk8\" (UID: \"5872fd60-b8c9-4f00-8c9a-679960a32e27\") " pod="openstack/barbican-db-sync-s4bk8" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.894187 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb88c2dd-0bb3-4425-842f-b697d51f8273-combined-ca-bundle\") pod \"placement-db-sync-jv6jw\" (UID: \"fb88c2dd-0bb3-4425-842f-b697d51f8273\") " pod="openstack/placement-db-sync-jv6jw" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.894231 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb88c2dd-0bb3-4425-842f-b697d51f8273-logs\") pod \"placement-db-sync-jv6jw\" (UID: \"fb88c2dd-0bb3-4425-842f-b697d51f8273\") " pod="openstack/placement-db-sync-jv6jw" Sep 29 10:05:15 crc 
kubenswrapper[4891]: I0929 10:05:15.894624 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-tkdcz\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.894672 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrttv\" (UniqueName: \"kubernetes.io/projected/048fe6e5-9e83-456c-965f-d4b0a7378b02-kube-api-access-jrttv\") pod \"dnsmasq-dns-57c957c4ff-tkdcz\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.894697 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-tkdcz\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.894720 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfr6b\" (UniqueName: \"kubernetes.io/projected/fb88c2dd-0bb3-4425-842f-b697d51f8273-kube-api-access-cfr6b\") pod \"placement-db-sync-jv6jw\" (UID: \"fb88c2dd-0bb3-4425-842f-b697d51f8273\") " pod="openstack/placement-db-sync-jv6jw" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.894806 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47wk5\" (UniqueName: \"kubernetes.io/projected/e641569b-322f-4157-aaf2-44d5f700234d-kube-api-access-47wk5\") pod \"neutron-db-sync-bm7bg\" (UID: \"e641569b-322f-4157-aaf2-44d5f700234d\") " pod="openstack/neutron-db-sync-bm7bg" Sep 29 
10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.894835 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84c6c4b8-5f24-48db-884c-dd0669cb67cc-log-httpd\") pod \"ceilometer-0\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") " pod="openstack/ceilometer-0" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.894857 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5872fd60-b8c9-4f00-8c9a-679960a32e27-combined-ca-bundle\") pod \"barbican-db-sync-s4bk8\" (UID: \"5872fd60-b8c9-4f00-8c9a-679960a32e27\") " pod="openstack/barbican-db-sync-s4bk8" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.894909 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84c6c4b8-5f24-48db-884c-dd0669cb67cc-config-data\") pod \"ceilometer-0\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") " pod="openstack/ceilometer-0" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.894948 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84c6c4b8-5f24-48db-884c-dd0669cb67cc-run-httpd\") pod \"ceilometer-0\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") " pod="openstack/ceilometer-0" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.894979 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5872fd60-b8c9-4f00-8c9a-679960a32e27-db-sync-config-data\") pod \"barbican-db-sync-s4bk8\" (UID: \"5872fd60-b8c9-4f00-8c9a-679960a32e27\") " pod="openstack/barbican-db-sync-s4bk8" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.894996 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-tkdcz\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.896061 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cbfd6f48f-lw2wd" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.896498 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb88c2dd-0bb3-4425-842f-b697d51f8273-logs\") pod \"placement-db-sync-jv6jw\" (UID: \"fb88c2dd-0bb3-4425-842f-b697d51f8273\") " pod="openstack/placement-db-sync-jv6jw" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.898086 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-cck6l" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.898352 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.898497 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.898628 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.909914 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.914910 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e641569b-322f-4157-aaf2-44d5f700234d-config\") pod \"neutron-db-sync-bm7bg\" (UID: 
\"e641569b-322f-4157-aaf2-44d5f700234d\") " pod="openstack/neutron-db-sync-bm7bg" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.918394 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb88c2dd-0bb3-4425-842f-b697d51f8273-scripts\") pod \"placement-db-sync-jv6jw\" (UID: \"fb88c2dd-0bb3-4425-842f-b697d51f8273\") " pod="openstack/placement-db-sync-jv6jw" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.923506 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb88c2dd-0bb3-4425-842f-b697d51f8273-config-data\") pod \"placement-db-sync-jv6jw\" (UID: \"fb88c2dd-0bb3-4425-842f-b697d51f8273\") " pod="openstack/placement-db-sync-jv6jw" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.925222 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e641569b-322f-4157-aaf2-44d5f700234d-combined-ca-bundle\") pod \"neutron-db-sync-bm7bg\" (UID: \"e641569b-322f-4157-aaf2-44d5f700234d\") " pod="openstack/neutron-db-sync-bm7bg" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.926079 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb88c2dd-0bb3-4425-842f-b697d51f8273-combined-ca-bundle\") pod \"placement-db-sync-jv6jw\" (UID: \"fb88c2dd-0bb3-4425-842f-b697d51f8273\") " pod="openstack/placement-db-sync-jv6jw" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.926656 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47wk5\" (UniqueName: \"kubernetes.io/projected/e641569b-322f-4157-aaf2-44d5f700234d-kube-api-access-47wk5\") pod \"neutron-db-sync-bm7bg\" (UID: \"e641569b-322f-4157-aaf2-44d5f700234d\") " pod="openstack/neutron-db-sync-bm7bg" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.937456 4891 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfr6b\" (UniqueName: \"kubernetes.io/projected/fb88c2dd-0bb3-4425-842f-b697d51f8273-kube-api-access-cfr6b\") pod \"placement-db-sync-jv6jw\" (UID: \"fb88c2dd-0bb3-4425-842f-b697d51f8273\") " pod="openstack/placement-db-sync-jv6jw" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.996287 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jv6jw" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.996814 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce8814b4-68a4-437a-b1ae-8c368895cd8d-scripts\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.996883 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84c6c4b8-5f24-48db-884c-dd0669cb67cc-run-httpd\") pod \"ceilometer-0\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") " pod="openstack/ceilometer-0" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.996916 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5872fd60-b8c9-4f00-8c9a-679960a32e27-db-sync-config-data\") pod \"barbican-db-sync-s4bk8\" (UID: \"5872fd60-b8c9-4f00-8c9a-679960a32e27\") " pod="openstack/barbican-db-sync-s4bk8" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.996936 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-tkdcz\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " 
pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.996963 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce8814b4-68a4-437a-b1ae-8c368895cd8d-config-data\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.996987 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-tkdcz\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.997024 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84c6c4b8-5f24-48db-884c-dd0669cb67cc-scripts\") pod \"ceilometer-0\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") " pod="openstack/ceilometer-0" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.997048 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-config\") pod \"dnsmasq-dns-57c957c4ff-tkdcz\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.997075 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84c6c4b8-5f24-48db-884c-dd0669cb67cc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") " pod="openstack/ceilometer-0" Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.997103 4891 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce8814b4-68a4-437a-b1ae-8c368895cd8d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.997121 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tkzx\" (UniqueName: \"kubernetes.io/projected/84c6c4b8-5f24-48db-884c-dd0669cb67cc-kube-api-access-5tkzx\") pod \"ceilometer-0\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") " pod="openstack/ceilometer-0"
Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.997159 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84c6c4b8-5f24-48db-884c-dd0669cb67cc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") " pod="openstack/ceilometer-0"
Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.997176 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce8814b4-68a4-437a-b1ae-8c368895cd8d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.997201 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4sj8\" (UniqueName: \"kubernetes.io/projected/5872fd60-b8c9-4f00-8c9a-679960a32e27-kube-api-access-q4sj8\") pod \"barbican-db-sync-s4bk8\" (UID: \"5872fd60-b8c9-4f00-8c9a-679960a32e27\") " pod="openstack/barbican-db-sync-s4bk8"
Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.997250 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce8814b4-68a4-437a-b1ae-8c368895cd8d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.997269 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr8cf\" (UniqueName: \"kubernetes.io/projected/ce8814b4-68a4-437a-b1ae-8c368895cd8d-kube-api-access-qr8cf\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.997291 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-tkdcz\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz"
Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.997314 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrttv\" (UniqueName: \"kubernetes.io/projected/048fe6e5-9e83-456c-965f-d4b0a7378b02-kube-api-access-jrttv\") pod \"dnsmasq-dns-57c957c4ff-tkdcz\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz"
Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.997333 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce8814b4-68a4-437a-b1ae-8c368895cd8d-logs\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.997353 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-tkdcz\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz"
Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.997373 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.997407 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84c6c4b8-5f24-48db-884c-dd0669cb67cc-log-httpd\") pod \"ceilometer-0\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") " pod="openstack/ceilometer-0"
Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.997426 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5872fd60-b8c9-4f00-8c9a-679960a32e27-combined-ca-bundle\") pod \"barbican-db-sync-s4bk8\" (UID: \"5872fd60-b8c9-4f00-8c9a-679960a32e27\") " pod="openstack/barbican-db-sync-s4bk8"
Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.997451 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84c6c4b8-5f24-48db-884c-dd0669cb67cc-config-data\") pod \"ceilometer-0\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") " pod="openstack/ceilometer-0"
Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.997849 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-tkdcz\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz"
Sep 29 10:05:15 crc kubenswrapper[4891]: I0929 10:05:15.999099 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-config\") pod \"dnsmasq-dns-57c957c4ff-tkdcz\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.000116 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84c6c4b8-5f24-48db-884c-dd0669cb67cc-log-httpd\") pod \"ceilometer-0\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") " pod="openstack/ceilometer-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.001402 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-tkdcz\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.001493 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84c6c4b8-5f24-48db-884c-dd0669cb67cc-run-httpd\") pod \"ceilometer-0\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") " pod="openstack/ceilometer-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.001679 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-tkdcz\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.004823 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-tkdcz\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.006144 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84c6c4b8-5f24-48db-884c-dd0669cb67cc-config-data\") pod \"ceilometer-0\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") " pod="openstack/ceilometer-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.008603 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84c6c4b8-5f24-48db-884c-dd0669cb67cc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") " pod="openstack/ceilometer-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.009067 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84c6c4b8-5f24-48db-884c-dd0669cb67cc-scripts\") pod \"ceilometer-0\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") " pod="openstack/ceilometer-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.010848 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84c6c4b8-5f24-48db-884c-dd0669cb67cc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") " pod="openstack/ceilometer-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.013251 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5872fd60-b8c9-4f00-8c9a-679960a32e27-combined-ca-bundle\") pod \"barbican-db-sync-s4bk8\" (UID: \"5872fd60-b8c9-4f00-8c9a-679960a32e27\") " pod="openstack/barbican-db-sync-s4bk8"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.018651 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5872fd60-b8c9-4f00-8c9a-679960a32e27-db-sync-config-data\") pod \"barbican-db-sync-s4bk8\" (UID: \"5872fd60-b8c9-4f00-8c9a-679960a32e27\") " pod="openstack/barbican-db-sync-s4bk8"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.029128 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4sj8\" (UniqueName: \"kubernetes.io/projected/5872fd60-b8c9-4f00-8c9a-679960a32e27-kube-api-access-q4sj8\") pod \"barbican-db-sync-s4bk8\" (UID: \"5872fd60-b8c9-4f00-8c9a-679960a32e27\") " pod="openstack/barbican-db-sync-s4bk8"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.030496 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bm7bg"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.037775 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tkzx\" (UniqueName: \"kubernetes.io/projected/84c6c4b8-5f24-48db-884c-dd0669cb67cc-kube-api-access-5tkzx\") pod \"ceilometer-0\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") " pod="openstack/ceilometer-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.040653 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrttv\" (UniqueName: \"kubernetes.io/projected/048fe6e5-9e83-456c-965f-d4b0a7378b02-kube-api-access-jrttv\") pod \"dnsmasq-dns-57c957c4ff-tkdcz\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.079099 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.105558 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.108556 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce8814b4-68a4-437a-b1ae-8c368895cd8d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.109353 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr8cf\" (UniqueName: \"kubernetes.io/projected/ce8814b4-68a4-437a-b1ae-8c368895cd8d-kube-api-access-qr8cf\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.109519 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce8814b4-68a4-437a-b1ae-8c368895cd8d-logs\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.109547 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.110263 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce8814b4-68a4-437a-b1ae-8c368895cd8d-scripts\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.110326 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce8814b4-68a4-437a-b1ae-8c368895cd8d-config-data\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.110445 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce8814b4-68a4-437a-b1ae-8c368895cd8d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.110519 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce8814b4-68a4-437a-b1ae-8c368895cd8d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.110724 4891 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.114547 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce8814b4-68a4-437a-b1ae-8c368895cd8d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.115095 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce8814b4-68a4-437a-b1ae-8c368895cd8d-logs\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.115818 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce8814b4-68a4-437a-b1ae-8c368895cd8d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.128444 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-cxpf2"]
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.129318 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce8814b4-68a4-437a-b1ae-8c368895cd8d-config-data\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.138173 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce8814b4-68a4-437a-b1ae-8c368895cd8d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.138288 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce8814b4-68a4-437a-b1ae-8c368895cd8d-scripts\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.148713 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr8cf\" (UniqueName: \"kubernetes.io/projected/ce8814b4-68a4-437a-b1ae-8c368895cd8d-kube-api-access-qr8cf\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.153737 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-s4bk8"
Sep 29 10:05:16 crc kubenswrapper[4891]: W0929 10:05:16.154442 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d83f36a_73dd_4a94_9432_7e02ed30a437.slice/crio-3f324416befa0ed851773143ed267b5c721affc6ac93b098b9d7c069ae8c7620 WatchSource:0}: Error finding container 3f324416befa0ed851773143ed267b5c721affc6ac93b098b9d7c069ae8c7620: Status 404 returned error can't find the container with id 3f324416befa0ed851773143ed267b5c721affc6ac93b098b9d7c069ae8c7620
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.174474 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.238062 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.267329 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-cncns"]
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.273491 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-c9454dcd5-bk2r4"]
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.372479 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.429680 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.438049 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.438122 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.471972 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.501431 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7cbfd6f48f-lw2wd"]
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.545226 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.545747 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00983c03-0370-4feb-a9aa-c3d6cd63c49d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.545819 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00983c03-0370-4feb-a9aa-c3d6cd63c49d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.545907 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00983c03-0370-4feb-a9aa-c3d6cd63c49d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.545951 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00983c03-0370-4feb-a9aa-c3d6cd63c49d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.546052 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vmmx\" (UniqueName: \"kubernetes.io/projected/00983c03-0370-4feb-a9aa-c3d6cd63c49d-kube-api-access-4vmmx\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.546091 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00983c03-0370-4feb-a9aa-c3d6cd63c49d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.546120 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00983c03-0370-4feb-a9aa-c3d6cd63c49d-logs\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.632119 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" event={"ID":"3d83f36a-73dd-4a94-9432-7e02ed30a437","Type":"ContainerStarted","Data":"3f324416befa0ed851773143ed267b5c721affc6ac93b098b9d7c069ae8c7620"}
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.633837 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cbfd6f48f-lw2wd" event={"ID":"47e58639-672a-4281-b7da-5363647cb329","Type":"ContainerStarted","Data":"ec7be4746ff750d55e4d22ac749abd9d305a1eecfef6845d2141e0c3a0c2cc60"}
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.639146 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c9454dcd5-bk2r4" event={"ID":"f13e18bf-c8d3-4e58-ada2-b3014689271e","Type":"ContainerStarted","Data":"66bc133c77c0c9128ee13ff624e0a0aab090fa0c9eb8f543f117ec861f00a064"}
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.639563 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jv6jw"]
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.644284 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cncns" event={"ID":"943687e2-eefb-46b9-8595-224dc883d780","Type":"ContainerStarted","Data":"a8754d2ee09636ee9399a996ea319e7f1b528484ad1c2603c72ca25915389139"}
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.648138 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00983c03-0370-4feb-a9aa-c3d6cd63c49d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.648229 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00983c03-0370-4feb-a9aa-c3d6cd63c49d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.648313 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00983c03-0370-4feb-a9aa-c3d6cd63c49d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.648340 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00983c03-0370-4feb-a9aa-c3d6cd63c49d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.648429 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vmmx\" (UniqueName: \"kubernetes.io/projected/00983c03-0370-4feb-a9aa-c3d6cd63c49d-kube-api-access-4vmmx\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.648473 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00983c03-0370-4feb-a9aa-c3d6cd63c49d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.648521 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00983c03-0370-4feb-a9aa-c3d6cd63c49d-logs\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.648604 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.648974 4891 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.649533 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00983c03-0370-4feb-a9aa-c3d6cd63c49d-logs\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.654843 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00983c03-0370-4feb-a9aa-c3d6cd63c49d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.657641 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00983c03-0370-4feb-a9aa-c3d6cd63c49d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.658275 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00983c03-0370-4feb-a9aa-c3d6cd63c49d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.665604 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00983c03-0370-4feb-a9aa-c3d6cd63c49d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.665656 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00983c03-0370-4feb-a9aa-c3d6cd63c49d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.680230 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vmmx\" (UniqueName: \"kubernetes.io/projected/00983c03-0370-4feb-a9aa-c3d6cd63c49d-kube-api-access-4vmmx\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.730105 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.791126 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.828825 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-bm7bg"]
Sep 29 10:05:16 crc kubenswrapper[4891]: I0929 10:05:16.980280 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-tkdcz"]
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:16.998131 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-s4bk8"]
Sep 29 10:05:17 crc kubenswrapper[4891]: W0929 10:05:16.998374 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5872fd60_b8c9_4f00_8c9a_679960a32e27.slice/crio-908cbaacda7df76988b468af030768ef7108fc0bf1a730d129d15b1857516674 WatchSource:0}: Error finding container 908cbaacda7df76988b468af030768ef7108fc0bf1a730d129d15b1857516674: Status 404 returned error can't find the container with id 908cbaacda7df76988b468af030768ef7108fc0bf1a730d129d15b1857516674
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.017317 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 29 10:05:17 crc kubenswrapper[4891]: W0929 10:05:17.023513 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84c6c4b8_5f24_48db_884c_dd0669cb67cc.slice/crio-fd50780ba2d96f0284e2a6870e4918789821db24f6d1467c53bccdd0d21b4b15 WatchSource:0}: Error finding container fd50780ba2d96f0284e2a6870e4918789821db24f6d1467c53bccdd0d21b4b15: Status 404 returned error can't find the container with id fd50780ba2d96f0284e2a6870e4918789821db24f6d1467c53bccdd0d21b4b15
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.233685 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.424044 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 29 10:05:17 crc kubenswrapper[4891]: W0929 10:05:17.437493 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00983c03_0370_4feb_a9aa_c3d6cd63c49d.slice/crio-cd72b4f5ae38bcbd8c4e401f970dc04e041043a4d22e848fd0d7b7925fc47239 WatchSource:0}: Error finding container cd72b4f5ae38bcbd8c4e401f970dc04e041043a4d22e848fd0d7b7925fc47239: Status 404 returned error can't find the container with id cd72b4f5ae38bcbd8c4e401f970dc04e041043a4d22e848fd0d7b7925fc47239
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.660499 4891 generic.go:334] "Generic (PLEG): container finished" podID="3d83f36a-73dd-4a94-9432-7e02ed30a437" containerID="74badbd7c337b5b4e1171610c950d908e2426dc39f281d6ed70b44c85cadfc38" exitCode=0
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.660589 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" event={"ID":"3d83f36a-73dd-4a94-9432-7e02ed30a437","Type":"ContainerDied","Data":"74badbd7c337b5b4e1171610c950d908e2426dc39f281d6ed70b44c85cadfc38"}
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.665650 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bm7bg" event={"ID":"e641569b-322f-4157-aaf2-44d5f700234d","Type":"ContainerStarted","Data":"d53bf5890f49da04046f8b0e6e3e76b26c13f583a8914579e66d0cb000c17d7a"}
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.665698 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bm7bg" event={"ID":"e641569b-322f-4157-aaf2-44d5f700234d","Type":"ContainerStarted","Data":"834e8e0d50f9e1347cd44f91cd13b97e6442ded6c0c8beaa570743074308aee1"}
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.669080 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cncns" event={"ID":"943687e2-eefb-46b9-8595-224dc883d780","Type":"ContainerStarted","Data":"94733727fc21851a7beb9ab47c467ae767e5719fb96d08dd9b66466a29d4c5d5"}
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.671034 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"00983c03-0370-4feb-a9aa-c3d6cd63c49d","Type":"ContainerStarted","Data":"cd72b4f5ae38bcbd8c4e401f970dc04e041043a4d22e848fd0d7b7925fc47239"}
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.679025 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ce8814b4-68a4-437a-b1ae-8c368895cd8d","Type":"ContainerStarted","Data":"dbe29a6ee2a91c5ed3e8ce9fb8e5eb719dd98da926efc4007aa2b06c491a541a"}
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.684389 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-s4bk8" event={"ID":"5872fd60-b8c9-4f00-8c9a-679960a32e27","Type":"ContainerStarted","Data":"908cbaacda7df76988b468af030768ef7108fc0bf1a730d129d15b1857516674"}
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.714281 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84c6c4b8-5f24-48db-884c-dd0669cb67cc","Type":"ContainerStarted","Data":"fd50780ba2d96f0284e2a6870e4918789821db24f6d1467c53bccdd0d21b4b15"}
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.715724 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-bm7bg" podStartSLOduration=2.715699268 podStartE2EDuration="2.715699268s" podCreationTimestamp="2025-09-29 10:05:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:05:17.709610048 +0000 UTC m=+1047.914778369" watchObservedRunningTime="2025-09-29 10:05:17.715699268 +0000 UTC m=+1047.920867589"
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.716745 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jv6jw" event={"ID":"fb88c2dd-0bb3-4425-842f-b697d51f8273","Type":"ContainerStarted","Data":"edba210b63d0d5b6fde69a866c8df40d894a08465365b4c5e2e97a658901bf0e"}
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.732057 4891 generic.go:334] "Generic (PLEG): container finished" podID="048fe6e5-9e83-456c-965f-d4b0a7378b02" containerID="4d2e676f7926ea0e511b8a90a42809a09df26ec24c0e6b7ff34702547496b1c8" exitCode=0
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.732111 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" event={"ID":"048fe6e5-9e83-456c-965f-d4b0a7378b02","Type":"ContainerDied","Data":"4d2e676f7926ea0e511b8a90a42809a09df26ec24c0e6b7ff34702547496b1c8"}
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.732154 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" event={"ID":"048fe6e5-9e83-456c-965f-d4b0a7378b02","Type":"ContainerStarted","Data":"75f25d90bb1ff838a5163ffe9ef33db46cd7331529fe7fc92e95e16b45715f6a"}
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.754456 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-cncns" podStartSLOduration=3.754427223 podStartE2EDuration="3.754427223s" podCreationTimestamp="2025-09-29 10:05:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:05:17.727379993 +0000 UTC m=+1047.932548324" watchObservedRunningTime="2025-09-29 10:05:17.754427223 +0000 UTC m=+1047.959595544"
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.803684 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.852108 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.864120 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7cbfd6f48f-lw2wd"]
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.924547 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.952295 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-86c8665cc-jf4tb"]
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.954034 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86c8665cc-jf4tb"]
Sep 29 10:05:17 crc kubenswrapper[4891]: I0929 10:05:17.954130 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-86c8665cc-jf4tb" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.095101 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh9kp\" (UniqueName: \"kubernetes.io/projected/22592ee5-954e-4360-89b6-8f45892eb270-kube-api-access-mh9kp\") pod \"horizon-86c8665cc-jf4tb\" (UID: \"22592ee5-954e-4360-89b6-8f45892eb270\") " pod="openstack/horizon-86c8665cc-jf4tb" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.095470 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22592ee5-954e-4360-89b6-8f45892eb270-logs\") pod \"horizon-86c8665cc-jf4tb\" (UID: \"22592ee5-954e-4360-89b6-8f45892eb270\") " pod="openstack/horizon-86c8665cc-jf4tb" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.095706 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/22592ee5-954e-4360-89b6-8f45892eb270-horizon-secret-key\") pod \"horizon-86c8665cc-jf4tb\" (UID: \"22592ee5-954e-4360-89b6-8f45892eb270\") " pod="openstack/horizon-86c8665cc-jf4tb" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.095770 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22592ee5-954e-4360-89b6-8f45892eb270-scripts\") pod \"horizon-86c8665cc-jf4tb\" (UID: \"22592ee5-954e-4360-89b6-8f45892eb270\") " pod="openstack/horizon-86c8665cc-jf4tb" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.095945 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22592ee5-954e-4360-89b6-8f45892eb270-config-data\") pod \"horizon-86c8665cc-jf4tb\" (UID: \"22592ee5-954e-4360-89b6-8f45892eb270\") " 
pod="openstack/horizon-86c8665cc-jf4tb" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.169817 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.207870 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22592ee5-954e-4360-89b6-8f45892eb270-config-data\") pod \"horizon-86c8665cc-jf4tb\" (UID: \"22592ee5-954e-4360-89b6-8f45892eb270\") " pod="openstack/horizon-86c8665cc-jf4tb" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.208451 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh9kp\" (UniqueName: \"kubernetes.io/projected/22592ee5-954e-4360-89b6-8f45892eb270-kube-api-access-mh9kp\") pod \"horizon-86c8665cc-jf4tb\" (UID: \"22592ee5-954e-4360-89b6-8f45892eb270\") " pod="openstack/horizon-86c8665cc-jf4tb" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.208645 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22592ee5-954e-4360-89b6-8f45892eb270-logs\") pod \"horizon-86c8665cc-jf4tb\" (UID: \"22592ee5-954e-4360-89b6-8f45892eb270\") " pod="openstack/horizon-86c8665cc-jf4tb" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.208782 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/22592ee5-954e-4360-89b6-8f45892eb270-horizon-secret-key\") pod \"horizon-86c8665cc-jf4tb\" (UID: \"22592ee5-954e-4360-89b6-8f45892eb270\") " pod="openstack/horizon-86c8665cc-jf4tb" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.208817 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22592ee5-954e-4360-89b6-8f45892eb270-scripts\") pod 
\"horizon-86c8665cc-jf4tb\" (UID: \"22592ee5-954e-4360-89b6-8f45892eb270\") " pod="openstack/horizon-86c8665cc-jf4tb" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.209826 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22592ee5-954e-4360-89b6-8f45892eb270-scripts\") pod \"horizon-86c8665cc-jf4tb\" (UID: \"22592ee5-954e-4360-89b6-8f45892eb270\") " pod="openstack/horizon-86c8665cc-jf4tb" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.210316 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22592ee5-954e-4360-89b6-8f45892eb270-logs\") pod \"horizon-86c8665cc-jf4tb\" (UID: \"22592ee5-954e-4360-89b6-8f45892eb270\") " pod="openstack/horizon-86c8665cc-jf4tb" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.211587 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22592ee5-954e-4360-89b6-8f45892eb270-config-data\") pod \"horizon-86c8665cc-jf4tb\" (UID: \"22592ee5-954e-4360-89b6-8f45892eb270\") " pod="openstack/horizon-86c8665cc-jf4tb" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.218330 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/22592ee5-954e-4360-89b6-8f45892eb270-horizon-secret-key\") pod \"horizon-86c8665cc-jf4tb\" (UID: \"22592ee5-954e-4360-89b6-8f45892eb270\") " pod="openstack/horizon-86c8665cc-jf4tb" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.236801 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh9kp\" (UniqueName: \"kubernetes.io/projected/22592ee5-954e-4360-89b6-8f45892eb270-kube-api-access-mh9kp\") pod \"horizon-86c8665cc-jf4tb\" (UID: \"22592ee5-954e-4360-89b6-8f45892eb270\") " pod="openstack/horizon-86c8665cc-jf4tb" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 
10:05:18.310991 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-dns-swift-storage-0\") pod \"3d83f36a-73dd-4a94-9432-7e02ed30a437\" (UID: \"3d83f36a-73dd-4a94-9432-7e02ed30a437\") " Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.311106 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-config\") pod \"3d83f36a-73dd-4a94-9432-7e02ed30a437\" (UID: \"3d83f36a-73dd-4a94-9432-7e02ed30a437\") " Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.311175 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-dns-svc\") pod \"3d83f36a-73dd-4a94-9432-7e02ed30a437\" (UID: \"3d83f36a-73dd-4a94-9432-7e02ed30a437\") " Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.311193 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-ovsdbserver-nb\") pod \"3d83f36a-73dd-4a94-9432-7e02ed30a437\" (UID: \"3d83f36a-73dd-4a94-9432-7e02ed30a437\") " Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.311250 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tt76\" (UniqueName: \"kubernetes.io/projected/3d83f36a-73dd-4a94-9432-7e02ed30a437-kube-api-access-8tt76\") pod \"3d83f36a-73dd-4a94-9432-7e02ed30a437\" (UID: \"3d83f36a-73dd-4a94-9432-7e02ed30a437\") " Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.311274 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-ovsdbserver-sb\") pod 
\"3d83f36a-73dd-4a94-9432-7e02ed30a437\" (UID: \"3d83f36a-73dd-4a94-9432-7e02ed30a437\") " Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.324656 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86c8665cc-jf4tb" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.347324 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d83f36a-73dd-4a94-9432-7e02ed30a437-kube-api-access-8tt76" (OuterVolumeSpecName: "kube-api-access-8tt76") pod "3d83f36a-73dd-4a94-9432-7e02ed30a437" (UID: "3d83f36a-73dd-4a94-9432-7e02ed30a437"). InnerVolumeSpecName "kube-api-access-8tt76". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.408194 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3d83f36a-73dd-4a94-9432-7e02ed30a437" (UID: "3d83f36a-73dd-4a94-9432-7e02ed30a437"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.414134 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.414159 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tt76\" (UniqueName: \"kubernetes.io/projected/3d83f36a-73dd-4a94-9432-7e02ed30a437-kube-api-access-8tt76\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.423181 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-config" (OuterVolumeSpecName: "config") pod "3d83f36a-73dd-4a94-9432-7e02ed30a437" (UID: "3d83f36a-73dd-4a94-9432-7e02ed30a437"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.480036 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3d83f36a-73dd-4a94-9432-7e02ed30a437" (UID: "3d83f36a-73dd-4a94-9432-7e02ed30a437"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.486317 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3d83f36a-73dd-4a94-9432-7e02ed30a437" (UID: "3d83f36a-73dd-4a94-9432-7e02ed30a437"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.490282 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3d83f36a-73dd-4a94-9432-7e02ed30a437" (UID: "3d83f36a-73dd-4a94-9432-7e02ed30a437"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.518056 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.518088 4891 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.518097 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.518107 4891 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d83f36a-73dd-4a94-9432-7e02ed30a437-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.772033 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ce8814b4-68a4-437a-b1ae-8c368895cd8d","Type":"ContainerStarted","Data":"2b32950155a1691de053cab655ac476695466b0bc80774a1abcf4ea26860e63a"} Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.774287 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" event={"ID":"3d83f36a-73dd-4a94-9432-7e02ed30a437","Type":"ContainerDied","Data":"3f324416befa0ed851773143ed267b5c721affc6ac93b098b9d7c069ae8c7620"} Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.774337 4891 scope.go:117] "RemoveContainer" containerID="74badbd7c337b5b4e1171610c950d908e2426dc39f281d6ed70b44c85cadfc38" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.774427 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-cxpf2" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.779358 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" event={"ID":"048fe6e5-9e83-456c-965f-d4b0a7378b02","Type":"ContainerStarted","Data":"1d3f9e19f41504fccd8bb715da857cc470817c51340a28de447051e0e4a720f7"} Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.779405 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.799084 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" podStartSLOduration=3.799057591 podStartE2EDuration="3.799057591s" podCreationTimestamp="2025-09-29 10:05:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:05:18.798427672 +0000 UTC m=+1049.003595993" watchObservedRunningTime="2025-09-29 10:05:18.799057591 +0000 UTC m=+1049.004225912" Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.888602 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-cxpf2"] Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 10:05:18.903940 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-cxpf2"] Sep 29 10:05:18 crc kubenswrapper[4891]: I0929 
10:05:18.916578 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86c8665cc-jf4tb"] Sep 29 10:05:19 crc kubenswrapper[4891]: I0929 10:05:19.714452 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8536-account-create-sgsw4"] Sep 29 10:05:19 crc kubenswrapper[4891]: E0929 10:05:19.715312 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d83f36a-73dd-4a94-9432-7e02ed30a437" containerName="init" Sep 29 10:05:19 crc kubenswrapper[4891]: I0929 10:05:19.715334 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d83f36a-73dd-4a94-9432-7e02ed30a437" containerName="init" Sep 29 10:05:19 crc kubenswrapper[4891]: I0929 10:05:19.715565 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d83f36a-73dd-4a94-9432-7e02ed30a437" containerName="init" Sep 29 10:05:19 crc kubenswrapper[4891]: I0929 10:05:19.716463 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8536-account-create-sgsw4" Sep 29 10:05:19 crc kubenswrapper[4891]: I0929 10:05:19.719561 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Sep 29 10:05:19 crc kubenswrapper[4891]: I0929 10:05:19.737411 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8536-account-create-sgsw4"] Sep 29 10:05:19 crc kubenswrapper[4891]: I0929 10:05:19.801596 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86c8665cc-jf4tb" event={"ID":"22592ee5-954e-4360-89b6-8f45892eb270","Type":"ContainerStarted","Data":"d7b877397347f964bf750bff0d74b8cee4f3c86f63570ac67ef6984217b71bb2"} Sep 29 10:05:19 crc kubenswrapper[4891]: I0929 10:05:19.803879 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"00983c03-0370-4feb-a9aa-c3d6cd63c49d","Type":"ContainerStarted","Data":"4772e9da12668efa5f475d6eeb27a333d78ca81c29463d6129782f039ea2af55"} Sep 29 
10:05:19 crc kubenswrapper[4891]: I0929 10:05:19.809855 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ce8814b4-68a4-437a-b1ae-8c368895cd8d","Type":"ContainerStarted","Data":"6a708a123a13f1ce1b492103beddccffd979da3d837eebd372eaebfb64fa5afc"} Sep 29 10:05:19 crc kubenswrapper[4891]: I0929 10:05:19.809957 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ce8814b4-68a4-437a-b1ae-8c368895cd8d" containerName="glance-log" containerID="cri-o://2b32950155a1691de053cab655ac476695466b0bc80774a1abcf4ea26860e63a" gracePeriod=30 Sep 29 10:05:19 crc kubenswrapper[4891]: I0929 10:05:19.810406 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ce8814b4-68a4-437a-b1ae-8c368895cd8d" containerName="glance-httpd" containerID="cri-o://6a708a123a13f1ce1b492103beddccffd979da3d837eebd372eaebfb64fa5afc" gracePeriod=30 Sep 29 10:05:19 crc kubenswrapper[4891]: I0929 10:05:19.840565 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.840541636 podStartE2EDuration="4.840541636s" podCreationTimestamp="2025-09-29 10:05:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:05:19.832842208 +0000 UTC m=+1050.038010529" watchObservedRunningTime="2025-09-29 10:05:19.840541636 +0000 UTC m=+1050.045709957" Sep 29 10:05:19 crc kubenswrapper[4891]: I0929 10:05:19.860408 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67xhc\" (UniqueName: \"kubernetes.io/projected/fa4026f9-ee20-44aa-9575-b1e64680139a-kube-api-access-67xhc\") pod \"cinder-8536-account-create-sgsw4\" (UID: \"fa4026f9-ee20-44aa-9575-b1e64680139a\") " 
pod="openstack/cinder-8536-account-create-sgsw4" Sep 29 10:05:19 crc kubenswrapper[4891]: I0929 10:05:19.962889 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67xhc\" (UniqueName: \"kubernetes.io/projected/fa4026f9-ee20-44aa-9575-b1e64680139a-kube-api-access-67xhc\") pod \"cinder-8536-account-create-sgsw4\" (UID: \"fa4026f9-ee20-44aa-9575-b1e64680139a\") " pod="openstack/cinder-8536-account-create-sgsw4" Sep 29 10:05:19 crc kubenswrapper[4891]: I0929 10:05:19.992752 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67xhc\" (UniqueName: \"kubernetes.io/projected/fa4026f9-ee20-44aa-9575-b1e64680139a-kube-api-access-67xhc\") pod \"cinder-8536-account-create-sgsw4\" (UID: \"fa4026f9-ee20-44aa-9575-b1e64680139a\") " pod="openstack/cinder-8536-account-create-sgsw4" Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.078512 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8536-account-create-sgsw4" Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.420916 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d83f36a-73dd-4a94-9432-7e02ed30a437" path="/var/lib/kubelet/pods/3d83f36a-73dd-4a94-9432-7e02ed30a437/volumes" Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.611703 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8536-account-create-sgsw4"] Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.825578 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.842162 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"00983c03-0370-4feb-a9aa-c3d6cd63c49d","Type":"ContainerStarted","Data":"cdb0b1b8c07ad6f8cab1bc1ccc586e88c894142d4d2cd29eea693dae98b94df1"} Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.842205 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="00983c03-0370-4feb-a9aa-c3d6cd63c49d" containerName="glance-log" containerID="cri-o://4772e9da12668efa5f475d6eeb27a333d78ca81c29463d6129782f039ea2af55" gracePeriod=30 Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.842305 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="00983c03-0370-4feb-a9aa-c3d6cd63c49d" containerName="glance-httpd" containerID="cri-o://cdb0b1b8c07ad6f8cab1bc1ccc586e88c894142d4d2cd29eea693dae98b94df1" gracePeriod=30 Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.863986 4891 generic.go:334] "Generic (PLEG): container finished" podID="ce8814b4-68a4-437a-b1ae-8c368895cd8d" containerID="6a708a123a13f1ce1b492103beddccffd979da3d837eebd372eaebfb64fa5afc" exitCode=0 Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.864019 4891 generic.go:334] "Generic (PLEG): container finished" podID="ce8814b4-68a4-437a-b1ae-8c368895cd8d" containerID="2b32950155a1691de053cab655ac476695466b0bc80774a1abcf4ea26860e63a" exitCode=143 Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.864064 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ce8814b4-68a4-437a-b1ae-8c368895cd8d","Type":"ContainerDied","Data":"6a708a123a13f1ce1b492103beddccffd979da3d837eebd372eaebfb64fa5afc"} Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.864102 
4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ce8814b4-68a4-437a-b1ae-8c368895cd8d","Type":"ContainerDied","Data":"2b32950155a1691de053cab655ac476695466b0bc80774a1abcf4ea26860e63a"} Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.864112 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ce8814b4-68a4-437a-b1ae-8c368895cd8d","Type":"ContainerDied","Data":"dbe29a6ee2a91c5ed3e8ce9fb8e5eb719dd98da926efc4007aa2b06c491a541a"} Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.864130 4891 scope.go:117] "RemoveContainer" containerID="6a708a123a13f1ce1b492103beddccffd979da3d837eebd372eaebfb64fa5afc" Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.864255 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.865637 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8536-account-create-sgsw4" event={"ID":"fa4026f9-ee20-44aa-9575-b1e64680139a","Type":"ContainerStarted","Data":"aa351c5a93b858f54bfeac70bec779c34890e61d64676a2656e9ffde8a5a0305"} Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.900429 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce8814b4-68a4-437a-b1ae-8c368895cd8d-config-data\") pod \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.900512 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce8814b4-68a4-437a-b1ae-8c368895cd8d-public-tls-certs\") pod \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 
10:05:20.900544 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce8814b4-68a4-437a-b1ae-8c368895cd8d-logs\") pod \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.900663 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce8814b4-68a4-437a-b1ae-8c368895cd8d-scripts\") pod \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.900729 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce8814b4-68a4-437a-b1ae-8c368895cd8d-httpd-run\") pod \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.900760 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr8cf\" (UniqueName: \"kubernetes.io/projected/ce8814b4-68a4-437a-b1ae-8c368895cd8d-kube-api-access-qr8cf\") pod \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.900864 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce8814b4-68a4-437a-b1ae-8c368895cd8d-combined-ca-bundle\") pod \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\" (UID: \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.900926 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ce8814b4-68a4-437a-b1ae-8c368895cd8d\" (UID: 
\"ce8814b4-68a4-437a-b1ae-8c368895cd8d\") " Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.901313 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce8814b4-68a4-437a-b1ae-8c368895cd8d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ce8814b4-68a4-437a-b1ae-8c368895cd8d" (UID: "ce8814b4-68a4-437a-b1ae-8c368895cd8d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.901756 4891 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce8814b4-68a4-437a-b1ae-8c368895cd8d-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.902149 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce8814b4-68a4-437a-b1ae-8c368895cd8d-logs" (OuterVolumeSpecName: "logs") pod "ce8814b4-68a4-437a-b1ae-8c368895cd8d" (UID: "ce8814b4-68a4-437a-b1ae-8c368895cd8d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.907312 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.907284746 podStartE2EDuration="5.907284746s" podCreationTimestamp="2025-09-29 10:05:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:05:20.898570008 +0000 UTC m=+1051.103738349" watchObservedRunningTime="2025-09-29 10:05:20.907284746 +0000 UTC m=+1051.112453077" Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.908606 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "ce8814b4-68a4-437a-b1ae-8c368895cd8d" (UID: "ce8814b4-68a4-437a-b1ae-8c368895cd8d"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.911668 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce8814b4-68a4-437a-b1ae-8c368895cd8d-scripts" (OuterVolumeSpecName: "scripts") pod "ce8814b4-68a4-437a-b1ae-8c368895cd8d" (UID: "ce8814b4-68a4-437a-b1ae-8c368895cd8d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.924618 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce8814b4-68a4-437a-b1ae-8c368895cd8d-kube-api-access-qr8cf" (OuterVolumeSpecName: "kube-api-access-qr8cf") pod "ce8814b4-68a4-437a-b1ae-8c368895cd8d" (UID: "ce8814b4-68a4-437a-b1ae-8c368895cd8d"). InnerVolumeSpecName "kube-api-access-qr8cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:20 crc kubenswrapper[4891]: I0929 10:05:20.994507 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce8814b4-68a4-437a-b1ae-8c368895cd8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce8814b4-68a4-437a-b1ae-8c368895cd8d" (UID: "ce8814b4-68a4-437a-b1ae-8c368895cd8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:20.999548 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce8814b4-68a4-437a-b1ae-8c368895cd8d-config-data" (OuterVolumeSpecName: "config-data") pod "ce8814b4-68a4-437a-b1ae-8c368895cd8d" (UID: "ce8814b4-68a4-437a-b1ae-8c368895cd8d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.004940 4891 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.004982 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce8814b4-68a4-437a-b1ae-8c368895cd8d-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.004999 4891 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce8814b4-68a4-437a-b1ae-8c368895cd8d-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.005011 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce8814b4-68a4-437a-b1ae-8c368895cd8d-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 
10:05:21.005023 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr8cf\" (UniqueName: \"kubernetes.io/projected/ce8814b4-68a4-437a-b1ae-8c368895cd8d-kube-api-access-qr8cf\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.005043 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce8814b4-68a4-437a-b1ae-8c368895cd8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.019596 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce8814b4-68a4-437a-b1ae-8c368895cd8d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ce8814b4-68a4-437a-b1ae-8c368895cd8d" (UID: "ce8814b4-68a4-437a-b1ae-8c368895cd8d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.033641 4891 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.106558 4891 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.107317 4891 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce8814b4-68a4-437a-b1ae-8c368895cd8d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.208175 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.223297 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-default-external-api-0"] Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.244581 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:05:21 crc kubenswrapper[4891]: E0929 10:05:21.245075 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce8814b4-68a4-437a-b1ae-8c368895cd8d" containerName="glance-httpd" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.245096 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce8814b4-68a4-437a-b1ae-8c368895cd8d" containerName="glance-httpd" Sep 29 10:05:21 crc kubenswrapper[4891]: E0929 10:05:21.245131 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce8814b4-68a4-437a-b1ae-8c368895cd8d" containerName="glance-log" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.245139 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce8814b4-68a4-437a-b1ae-8c368895cd8d" containerName="glance-log" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.245311 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce8814b4-68a4-437a-b1ae-8c368895cd8d" containerName="glance-httpd" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.245331 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce8814b4-68a4-437a-b1ae-8c368895cd8d" containerName="glance-log" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.246380 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.248454 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.250517 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.256079 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.416095 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.416512 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4nlr\" (UniqueName: \"kubernetes.io/projected/26933cfa-6c41-490b-85f7-a1e95cddfa96-kube-api-access-w4nlr\") pod \"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.416592 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26933cfa-6c41-490b-85f7-a1e95cddfa96-logs\") pod \"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.416676 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/26933cfa-6c41-490b-85f7-a1e95cddfa96-scripts\") pod \"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.416847 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26933cfa-6c41-490b-85f7-a1e95cddfa96-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.417113 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26933cfa-6c41-490b-85f7-a1e95cddfa96-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.417186 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/26933cfa-6c41-490b-85f7-a1e95cddfa96-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.417272 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26933cfa-6c41-490b-85f7-a1e95cddfa96-config-data\") pod \"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.518668 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/26933cfa-6c41-490b-85f7-a1e95cddfa96-scripts\") pod \"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.518748 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26933cfa-6c41-490b-85f7-a1e95cddfa96-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.518807 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26933cfa-6c41-490b-85f7-a1e95cddfa96-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.518854 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/26933cfa-6c41-490b-85f7-a1e95cddfa96-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.518921 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26933cfa-6c41-490b-85f7-a1e95cddfa96-config-data\") pod \"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.518953 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.518980 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4nlr\" (UniqueName: \"kubernetes.io/projected/26933cfa-6c41-490b-85f7-a1e95cddfa96-kube-api-access-w4nlr\") pod \"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.519063 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26933cfa-6c41-490b-85f7-a1e95cddfa96-logs\") pod \"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.519745 4891 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.519834 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/26933cfa-6c41-490b-85f7-a1e95cddfa96-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.519935 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26933cfa-6c41-490b-85f7-a1e95cddfa96-logs\") pod 
\"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.524268 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26933cfa-6c41-490b-85f7-a1e95cddfa96-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.524441 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26933cfa-6c41-490b-85f7-a1e95cddfa96-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.526909 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26933cfa-6c41-490b-85f7-a1e95cddfa96-config-data\") pod \"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.527524 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26933cfa-6c41-490b-85f7-a1e95cddfa96-scripts\") pod \"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.543521 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4nlr\" (UniqueName: \"kubernetes.io/projected/26933cfa-6c41-490b-85f7-a1e95cddfa96-kube-api-access-w4nlr\") pod \"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") " 
pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.551699 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.613039 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.882490 4891 generic.go:334] "Generic (PLEG): container finished" podID="00983c03-0370-4feb-a9aa-c3d6cd63c49d" containerID="cdb0b1b8c07ad6f8cab1bc1ccc586e88c894142d4d2cd29eea693dae98b94df1" exitCode=0 Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.882534 4891 generic.go:334] "Generic (PLEG): container finished" podID="00983c03-0370-4feb-a9aa-c3d6cd63c49d" containerID="4772e9da12668efa5f475d6eeb27a333d78ca81c29463d6129782f039ea2af55" exitCode=143 Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.882579 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"00983c03-0370-4feb-a9aa-c3d6cd63c49d","Type":"ContainerDied","Data":"cdb0b1b8c07ad6f8cab1bc1ccc586e88c894142d4d2cd29eea693dae98b94df1"} Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.882634 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"00983c03-0370-4feb-a9aa-c3d6cd63c49d","Type":"ContainerDied","Data":"4772e9da12668efa5f475d6eeb27a333d78ca81c29463d6129782f039ea2af55"} Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.884844 4891 generic.go:334] "Generic (PLEG): container finished" podID="943687e2-eefb-46b9-8595-224dc883d780" containerID="94733727fc21851a7beb9ab47c467ae767e5719fb96d08dd9b66466a29d4c5d5" exitCode=0 Sep 29 
10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.884906 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cncns" event={"ID":"943687e2-eefb-46b9-8595-224dc883d780","Type":"ContainerDied","Data":"94733727fc21851a7beb9ab47c467ae767e5719fb96d08dd9b66466a29d4c5d5"} Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.886982 4891 generic.go:334] "Generic (PLEG): container finished" podID="fa4026f9-ee20-44aa-9575-b1e64680139a" containerID="05976b9244536aaf671ae008bfa47e8488858cf9bd923da3799ab7987da88fc2" exitCode=0 Sep 29 10:05:21 crc kubenswrapper[4891]: I0929 10:05:21.887007 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8536-account-create-sgsw4" event={"ID":"fa4026f9-ee20-44aa-9575-b1e64680139a","Type":"ContainerDied","Data":"05976b9244536aaf671ae008bfa47e8488858cf9bd923da3799ab7987da88fc2"} Sep 29 10:05:22 crc kubenswrapper[4891]: I0929 10:05:22.416538 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce8814b4-68a4-437a-b1ae-8c368895cd8d" path="/var/lib/kubelet/pods/ce8814b4-68a4-437a-b1ae-8c368895cd8d/volumes" Sep 29 10:05:23 crc kubenswrapper[4891]: I0929 10:05:23.837513 4891 scope.go:117] "RemoveContainer" containerID="2b32950155a1691de053cab655ac476695466b0bc80774a1abcf4ea26860e63a" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.198802 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-c9454dcd5-bk2r4"] Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.241972 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7975d54bd8-pl4st"] Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.244087 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.246473 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.272084 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7975d54bd8-pl4st"] Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.295271 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4cfbacc9-ec34-4515-9874-1fd082cdbea3-horizon-secret-key\") pod \"horizon-7975d54bd8-pl4st\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") " pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.295319 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22cz4\" (UniqueName: \"kubernetes.io/projected/4cfbacc9-ec34-4515-9874-1fd082cdbea3-kube-api-access-22cz4\") pod \"horizon-7975d54bd8-pl4st\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") " pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.295409 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cfbacc9-ec34-4515-9874-1fd082cdbea3-config-data\") pod \"horizon-7975d54bd8-pl4st\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") " pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.295446 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4cfbacc9-ec34-4515-9874-1fd082cdbea3-scripts\") pod \"horizon-7975d54bd8-pl4st\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") " pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 
10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.295519 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cfbacc9-ec34-4515-9874-1fd082cdbea3-horizon-tls-certs\") pod \"horizon-7975d54bd8-pl4st\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") " pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.295567 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cfbacc9-ec34-4515-9874-1fd082cdbea3-logs\") pod \"horizon-7975d54bd8-pl4st\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") " pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.295741 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cfbacc9-ec34-4515-9874-1fd082cdbea3-combined-ca-bundle\") pod \"horizon-7975d54bd8-pl4st\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") " pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.300941 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.333095 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86c8665cc-jf4tb"] Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.360105 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6d8cd8ff44-d8rc8"] Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.361882 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.396630 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cfbacc9-ec34-4515-9874-1fd082cdbea3-logs\") pod \"horizon-7975d54bd8-pl4st\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") " pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.396708 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cfbacc9-ec34-4515-9874-1fd082cdbea3-combined-ca-bundle\") pod \"horizon-7975d54bd8-pl4st\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") " pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.396733 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d464aff7-6448-4eaf-b88e-01a8acc3e42a-logs\") pod \"horizon-6d8cd8ff44-d8rc8\" (UID: \"d464aff7-6448-4eaf-b88e-01a8acc3e42a\") " pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.396756 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d464aff7-6448-4eaf-b88e-01a8acc3e42a-config-data\") pod \"horizon-6d8cd8ff44-d8rc8\" (UID: \"d464aff7-6448-4eaf-b88e-01a8acc3e42a\") " pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.396820 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d464aff7-6448-4eaf-b88e-01a8acc3e42a-scripts\") pod \"horizon-6d8cd8ff44-d8rc8\" (UID: \"d464aff7-6448-4eaf-b88e-01a8acc3e42a\") " pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:24 crc 
kubenswrapper[4891]: I0929 10:05:24.396854 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4cfbacc9-ec34-4515-9874-1fd082cdbea3-horizon-secret-key\") pod \"horizon-7975d54bd8-pl4st\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") " pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.396873 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22cz4\" (UniqueName: \"kubernetes.io/projected/4cfbacc9-ec34-4515-9874-1fd082cdbea3-kube-api-access-22cz4\") pod \"horizon-7975d54bd8-pl4st\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") " pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.396903 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d464aff7-6448-4eaf-b88e-01a8acc3e42a-horizon-secret-key\") pod \"horizon-6d8cd8ff44-d8rc8\" (UID: \"d464aff7-6448-4eaf-b88e-01a8acc3e42a\") " pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.396922 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d464aff7-6448-4eaf-b88e-01a8acc3e42a-combined-ca-bundle\") pod \"horizon-6d8cd8ff44-d8rc8\" (UID: \"d464aff7-6448-4eaf-b88e-01a8acc3e42a\") " pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.396956 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d464aff7-6448-4eaf-b88e-01a8acc3e42a-horizon-tls-certs\") pod \"horizon-6d8cd8ff44-d8rc8\" (UID: \"d464aff7-6448-4eaf-b88e-01a8acc3e42a\") " pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:24 crc 
kubenswrapper[4891]: I0929 10:05:24.396976 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cfbacc9-ec34-4515-9874-1fd082cdbea3-config-data\") pod \"horizon-7975d54bd8-pl4st\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") " pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.397002 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4cfbacc9-ec34-4515-9874-1fd082cdbea3-scripts\") pod \"horizon-7975d54bd8-pl4st\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") " pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.397020 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbpwd\" (UniqueName: \"kubernetes.io/projected/d464aff7-6448-4eaf-b88e-01a8acc3e42a-kube-api-access-qbpwd\") pod \"horizon-6d8cd8ff44-d8rc8\" (UID: \"d464aff7-6448-4eaf-b88e-01a8acc3e42a\") " pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.397045 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cfbacc9-ec34-4515-9874-1fd082cdbea3-horizon-tls-certs\") pod \"horizon-7975d54bd8-pl4st\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") " pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.399747 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cfbacc9-ec34-4515-9874-1fd082cdbea3-logs\") pod \"horizon-7975d54bd8-pl4st\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") " pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.406437 4891 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4cfbacc9-ec34-4515-9874-1fd082cdbea3-scripts\") pod \"horizon-7975d54bd8-pl4st\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") " pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.407641 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cfbacc9-ec34-4515-9874-1fd082cdbea3-config-data\") pod \"horizon-7975d54bd8-pl4st\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") " pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.411395 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cfbacc9-ec34-4515-9874-1fd082cdbea3-horizon-tls-certs\") pod \"horizon-7975d54bd8-pl4st\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") " pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.411461 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4cfbacc9-ec34-4515-9874-1fd082cdbea3-horizon-secret-key\") pod \"horizon-7975d54bd8-pl4st\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") " pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.424343 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d8cd8ff44-d8rc8"] Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.425499 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cfbacc9-ec34-4515-9874-1fd082cdbea3-combined-ca-bundle\") pod \"horizon-7975d54bd8-pl4st\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") " pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.426885 4891 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-22cz4\" (UniqueName: \"kubernetes.io/projected/4cfbacc9-ec34-4515-9874-1fd082cdbea3-kube-api-access-22cz4\") pod \"horizon-7975d54bd8-pl4st\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") " pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.517850 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d464aff7-6448-4eaf-b88e-01a8acc3e42a-horizon-secret-key\") pod \"horizon-6d8cd8ff44-d8rc8\" (UID: \"d464aff7-6448-4eaf-b88e-01a8acc3e42a\") " pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.518014 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d464aff7-6448-4eaf-b88e-01a8acc3e42a-combined-ca-bundle\") pod \"horizon-6d8cd8ff44-d8rc8\" (UID: \"d464aff7-6448-4eaf-b88e-01a8acc3e42a\") " pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.518146 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d464aff7-6448-4eaf-b88e-01a8acc3e42a-horizon-tls-certs\") pod \"horizon-6d8cd8ff44-d8rc8\" (UID: \"d464aff7-6448-4eaf-b88e-01a8acc3e42a\") " pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.518263 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbpwd\" (UniqueName: \"kubernetes.io/projected/d464aff7-6448-4eaf-b88e-01a8acc3e42a-kube-api-access-qbpwd\") pod \"horizon-6d8cd8ff44-d8rc8\" (UID: \"d464aff7-6448-4eaf-b88e-01a8acc3e42a\") " pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.518425 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/d464aff7-6448-4eaf-b88e-01a8acc3e42a-logs\") pod \"horizon-6d8cd8ff44-d8rc8\" (UID: \"d464aff7-6448-4eaf-b88e-01a8acc3e42a\") " pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.518533 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d464aff7-6448-4eaf-b88e-01a8acc3e42a-config-data\") pod \"horizon-6d8cd8ff44-d8rc8\" (UID: \"d464aff7-6448-4eaf-b88e-01a8acc3e42a\") " pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.518686 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d464aff7-6448-4eaf-b88e-01a8acc3e42a-scripts\") pod \"horizon-6d8cd8ff44-d8rc8\" (UID: \"d464aff7-6448-4eaf-b88e-01a8acc3e42a\") " pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.520407 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d464aff7-6448-4eaf-b88e-01a8acc3e42a-logs\") pod \"horizon-6d8cd8ff44-d8rc8\" (UID: \"d464aff7-6448-4eaf-b88e-01a8acc3e42a\") " pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.526489 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d464aff7-6448-4eaf-b88e-01a8acc3e42a-horizon-tls-certs\") pod \"horizon-6d8cd8ff44-d8rc8\" (UID: \"d464aff7-6448-4eaf-b88e-01a8acc3e42a\") " pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.527075 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d464aff7-6448-4eaf-b88e-01a8acc3e42a-config-data\") pod \"horizon-6d8cd8ff44-d8rc8\" (UID: 
\"d464aff7-6448-4eaf-b88e-01a8acc3e42a\") " pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.527188 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d464aff7-6448-4eaf-b88e-01a8acc3e42a-scripts\") pod \"horizon-6d8cd8ff44-d8rc8\" (UID: \"d464aff7-6448-4eaf-b88e-01a8acc3e42a\") " pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.527399 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d464aff7-6448-4eaf-b88e-01a8acc3e42a-horizon-secret-key\") pod \"horizon-6d8cd8ff44-d8rc8\" (UID: \"d464aff7-6448-4eaf-b88e-01a8acc3e42a\") " pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.537706 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d464aff7-6448-4eaf-b88e-01a8acc3e42a-combined-ca-bundle\") pod \"horizon-6d8cd8ff44-d8rc8\" (UID: \"d464aff7-6448-4eaf-b88e-01a8acc3e42a\") " pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.558558 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbpwd\" (UniqueName: \"kubernetes.io/projected/d464aff7-6448-4eaf-b88e-01a8acc3e42a-kube-api-access-qbpwd\") pod \"horizon-6d8cd8ff44-d8rc8\" (UID: \"d464aff7-6448-4eaf-b88e-01a8acc3e42a\") " pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.579345 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:05:24 crc kubenswrapper[4891]: I0929 10:05:24.683544 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:26 crc kubenswrapper[4891]: I0929 10:05:26.108060 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" Sep 29 10:05:26 crc kubenswrapper[4891]: I0929 10:05:26.191225 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-r7nq7"] Sep 29 10:05:26 crc kubenswrapper[4891]: I0929 10:05:26.191505 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-895cf5cf-r7nq7" podUID="a59c7eb8-6677-4725-8a96-6920e1b84c83" containerName="dnsmasq-dns" containerID="cri-o://ebcfbfb0783dfbcba0948383f93b72cc28f36b28cf9522ee0cba3a18c02b544b" gracePeriod=10 Sep 29 10:05:26 crc kubenswrapper[4891]: I0929 10:05:26.957238 4891 generic.go:334] "Generic (PLEG): container finished" podID="a59c7eb8-6677-4725-8a96-6920e1b84c83" containerID="ebcfbfb0783dfbcba0948383f93b72cc28f36b28cf9522ee0cba3a18c02b544b" exitCode=0 Sep 29 10:05:26 crc kubenswrapper[4891]: I0929 10:05:26.957288 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-r7nq7" event={"ID":"a59c7eb8-6677-4725-8a96-6920e1b84c83","Type":"ContainerDied","Data":"ebcfbfb0783dfbcba0948383f93b72cc28f36b28cf9522ee0cba3a18c02b544b"} Sep 29 10:05:30 crc kubenswrapper[4891]: I0929 10:05:30.258263 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-895cf5cf-r7nq7" podUID="a59c7eb8-6677-4725-8a96-6920e1b84c83" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: connect: connection refused" Sep 29 10:05:32 crc kubenswrapper[4891]: E0929 10:05:32.848821 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Sep 29 10:05:32 crc kubenswrapper[4891]: E0929 10:05:32.849737 4891 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5c7h649hf8h587h56fh5bdh5b7h5d4h567h7dh5d8h5b9h664h54ch55dh68ch55fh547h5h9dh58fhcdh5b5h547h64ch88h55bh696h5d7h544h56fh5d4q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6z4rk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
horizon-7cbfd6f48f-lw2wd_openstack(47e58639-672a-4281-b7da-5363647cb329): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:05:32 crc kubenswrapper[4891]: E0929 10:05:32.852327 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7cbfd6f48f-lw2wd" podUID="47e58639-672a-4281-b7da-5363647cb329" Sep 29 10:05:34 crc kubenswrapper[4891]: E0929 10:05:34.421694 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Sep 29 10:05:34 crc kubenswrapper[4891]: E0929 10:05:34.422341 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cfr6b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-jv6jw_openstack(fb88c2dd-0bb3-4425-842f-b697d51f8273): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:05:34 crc kubenswrapper[4891]: E0929 10:05:34.424186 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-jv6jw" podUID="fb88c2dd-0bb3-4425-842f-b697d51f8273" Sep 29 10:05:34 crc kubenswrapper[4891]: E0929 10:05:34.452401 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Sep 29 10:05:34 crc kubenswrapper[4891]: E0929 10:05:34.452644 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nbh58fh688h55bhd8h585h5cch5fch67dh7h54bh7hb7h5bch5cdh565hcch659h5c5h577hf6h698h688h549h694hf6h6fh9dh576h684h6dh647q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v72hf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-c9454dcd5-bk2r4_openstack(f13e18bf-c8d3-4e58-ada2-b3014689271e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:05:34 crc kubenswrapper[4891]: E0929 
10:05:34.461650 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-c9454dcd5-bk2r4" podUID="f13e18bf-c8d3-4e58-ada2-b3014689271e" Sep 29 10:05:34 crc kubenswrapper[4891]: E0929 10:05:34.462351 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Sep 29 10:05:34 crc kubenswrapper[4891]: E0929 10:05:34.462784 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n594h8chc9h9bh59ch64dh87h74h687h659h5c5h688h56h56ch689hf7hcfh567hf9h689hd4h5bfh589hffh544h67fh7ch649h5cbh545h685hfq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mh9kp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-86c8665cc-jf4tb_openstack(22592ee5-954e-4360-89b6-8f45892eb270): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:05:34 crc kubenswrapper[4891]: E0929 
10:05:34.468122 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-86c8665cc-jf4tb" podUID="22592ee5-954e-4360-89b6-8f45892eb270" Sep 29 10:05:34 crc kubenswrapper[4891]: I0929 10:05:34.530386 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8536-account-create-sgsw4" Sep 29 10:05:34 crc kubenswrapper[4891]: I0929 10:05:34.549702 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cncns" Sep 29 10:05:34 crc kubenswrapper[4891]: I0929 10:05:34.580836 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67xhc\" (UniqueName: \"kubernetes.io/projected/fa4026f9-ee20-44aa-9575-b1e64680139a-kube-api-access-67xhc\") pod \"fa4026f9-ee20-44aa-9575-b1e64680139a\" (UID: \"fa4026f9-ee20-44aa-9575-b1e64680139a\") " Sep 29 10:05:34 crc kubenswrapper[4891]: I0929 10:05:34.580953 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-combined-ca-bundle\") pod \"943687e2-eefb-46b9-8595-224dc883d780\" (UID: \"943687e2-eefb-46b9-8595-224dc883d780\") " Sep 29 10:05:34 crc kubenswrapper[4891]: I0929 10:05:34.581056 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-credential-keys\") pod \"943687e2-eefb-46b9-8595-224dc883d780\" (UID: \"943687e2-eefb-46b9-8595-224dc883d780\") " Sep 29 10:05:34 crc kubenswrapper[4891]: I0929 
10:05:34.581133 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-config-data\") pod \"943687e2-eefb-46b9-8595-224dc883d780\" (UID: \"943687e2-eefb-46b9-8595-224dc883d780\") " Sep 29 10:05:34 crc kubenswrapper[4891]: I0929 10:05:34.581258 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkb5r\" (UniqueName: \"kubernetes.io/projected/943687e2-eefb-46b9-8595-224dc883d780-kube-api-access-lkb5r\") pod \"943687e2-eefb-46b9-8595-224dc883d780\" (UID: \"943687e2-eefb-46b9-8595-224dc883d780\") " Sep 29 10:05:34 crc kubenswrapper[4891]: I0929 10:05:34.581285 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-scripts\") pod \"943687e2-eefb-46b9-8595-224dc883d780\" (UID: \"943687e2-eefb-46b9-8595-224dc883d780\") " Sep 29 10:05:34 crc kubenswrapper[4891]: I0929 10:05:34.581430 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-fernet-keys\") pod \"943687e2-eefb-46b9-8595-224dc883d780\" (UID: \"943687e2-eefb-46b9-8595-224dc883d780\") " Sep 29 10:05:34 crc kubenswrapper[4891]: I0929 10:05:34.590006 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/943687e2-eefb-46b9-8595-224dc883d780-kube-api-access-lkb5r" (OuterVolumeSpecName: "kube-api-access-lkb5r") pod "943687e2-eefb-46b9-8595-224dc883d780" (UID: "943687e2-eefb-46b9-8595-224dc883d780"). InnerVolumeSpecName "kube-api-access-lkb5r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:34 crc kubenswrapper[4891]: I0929 10:05:34.590490 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "943687e2-eefb-46b9-8595-224dc883d780" (UID: "943687e2-eefb-46b9-8595-224dc883d780"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:34 crc kubenswrapper[4891]: I0929 10:05:34.605036 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "943687e2-eefb-46b9-8595-224dc883d780" (UID: "943687e2-eefb-46b9-8595-224dc883d780"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:34 crc kubenswrapper[4891]: I0929 10:05:34.614147 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-scripts" (OuterVolumeSpecName: "scripts") pod "943687e2-eefb-46b9-8595-224dc883d780" (UID: "943687e2-eefb-46b9-8595-224dc883d780"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:34 crc kubenswrapper[4891]: I0929 10:05:34.619907 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa4026f9-ee20-44aa-9575-b1e64680139a-kube-api-access-67xhc" (OuterVolumeSpecName: "kube-api-access-67xhc") pod "fa4026f9-ee20-44aa-9575-b1e64680139a" (UID: "fa4026f9-ee20-44aa-9575-b1e64680139a"). InnerVolumeSpecName "kube-api-access-67xhc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:34 crc kubenswrapper[4891]: I0929 10:05:34.636925 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-config-data" (OuterVolumeSpecName: "config-data") pod "943687e2-eefb-46b9-8595-224dc883d780" (UID: "943687e2-eefb-46b9-8595-224dc883d780"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:34 crc kubenswrapper[4891]: I0929 10:05:34.640071 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "943687e2-eefb-46b9-8595-224dc883d780" (UID: "943687e2-eefb-46b9-8595-224dc883d780"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:34 crc kubenswrapper[4891]: I0929 10:05:34.686332 4891 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:34 crc kubenswrapper[4891]: I0929 10:05:34.686373 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67xhc\" (UniqueName: \"kubernetes.io/projected/fa4026f9-ee20-44aa-9575-b1e64680139a-kube-api-access-67xhc\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:34 crc kubenswrapper[4891]: I0929 10:05:34.686385 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:34 crc kubenswrapper[4891]: I0929 10:05:34.686396 4891 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-credential-keys\") on node \"crc\" 
DevicePath \"\"" Sep 29 10:05:34 crc kubenswrapper[4891]: I0929 10:05:34.686405 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:34 crc kubenswrapper[4891]: I0929 10:05:34.686413 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkb5r\" (UniqueName: \"kubernetes.io/projected/943687e2-eefb-46b9-8595-224dc883d780-kube-api-access-lkb5r\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:34 crc kubenswrapper[4891]: I0929 10:05:34.686421 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/943687e2-eefb-46b9-8595-224dc883d780-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.034632 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cncns" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.034652 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cncns" event={"ID":"943687e2-eefb-46b9-8595-224dc883d780","Type":"ContainerDied","Data":"a8754d2ee09636ee9399a996ea319e7f1b528484ad1c2603c72ca25915389139"} Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.034719 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8754d2ee09636ee9399a996ea319e7f1b528484ad1c2603c72ca25915389139" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.036333 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8536-account-create-sgsw4" event={"ID":"fa4026f9-ee20-44aa-9575-b1e64680139a","Type":"ContainerDied","Data":"aa351c5a93b858f54bfeac70bec779c34890e61d64676a2656e9ffde8a5a0305"} Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.036402 4891 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="aa351c5a93b858f54bfeac70bec779c34890e61d64676a2656e9ffde8a5a0305" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.036507 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8536-account-create-sgsw4" Sep 29 10:05:35 crc kubenswrapper[4891]: E0929 10:05:35.043090 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Sep 29 10:05:35 crc kubenswrapper[4891]: E0929 10:05:35.043355 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q4sj8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEsc
alation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-s4bk8_openstack(5872fd60-b8c9-4f00-8c9a-679960a32e27): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:05:35 crc kubenswrapper[4891]: E0929 10:05:35.044586 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-s4bk8" podUID="5872fd60-b8c9-4f00-8c9a-679960a32e27" Sep 29 10:05:35 crc kubenswrapper[4891]: E0929 10:05:35.063416 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-jv6jw" podUID="fb88c2dd-0bb3-4425-842f-b697d51f8273" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.137383 4891 scope.go:117] "RemoveContainer" containerID="6a708a123a13f1ce1b492103beddccffd979da3d837eebd372eaebfb64fa5afc" Sep 29 10:05:35 crc kubenswrapper[4891]: E0929 10:05:35.139901 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a708a123a13f1ce1b492103beddccffd979da3d837eebd372eaebfb64fa5afc\": container with ID starting with 6a708a123a13f1ce1b492103beddccffd979da3d837eebd372eaebfb64fa5afc not found: ID does not exist" containerID="6a708a123a13f1ce1b492103beddccffd979da3d837eebd372eaebfb64fa5afc" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.139953 4891 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a708a123a13f1ce1b492103beddccffd979da3d837eebd372eaebfb64fa5afc"} err="failed to get container status \"6a708a123a13f1ce1b492103beddccffd979da3d837eebd372eaebfb64fa5afc\": rpc error: code = NotFound desc = could not find container \"6a708a123a13f1ce1b492103beddccffd979da3d837eebd372eaebfb64fa5afc\": container with ID starting with 6a708a123a13f1ce1b492103beddccffd979da3d837eebd372eaebfb64fa5afc not found: ID does not exist" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.139984 4891 scope.go:117] "RemoveContainer" containerID="2b32950155a1691de053cab655ac476695466b0bc80774a1abcf4ea26860e63a" Sep 29 10:05:35 crc kubenswrapper[4891]: E0929 10:05:35.141661 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b32950155a1691de053cab655ac476695466b0bc80774a1abcf4ea26860e63a\": container with ID starting with 2b32950155a1691de053cab655ac476695466b0bc80774a1abcf4ea26860e63a not found: ID does not exist" containerID="2b32950155a1691de053cab655ac476695466b0bc80774a1abcf4ea26860e63a" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.141719 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b32950155a1691de053cab655ac476695466b0bc80774a1abcf4ea26860e63a"} err="failed to get container status \"2b32950155a1691de053cab655ac476695466b0bc80774a1abcf4ea26860e63a\": rpc error: code = NotFound desc = could not find container \"2b32950155a1691de053cab655ac476695466b0bc80774a1abcf4ea26860e63a\": container with ID starting with 2b32950155a1691de053cab655ac476695466b0bc80774a1abcf4ea26860e63a not found: ID does not exist" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.141756 4891 scope.go:117] "RemoveContainer" containerID="6a708a123a13f1ce1b492103beddccffd979da3d837eebd372eaebfb64fa5afc" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.144094 4891 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a708a123a13f1ce1b492103beddccffd979da3d837eebd372eaebfb64fa5afc"} err="failed to get container status \"6a708a123a13f1ce1b492103beddccffd979da3d837eebd372eaebfb64fa5afc\": rpc error: code = NotFound desc = could not find container \"6a708a123a13f1ce1b492103beddccffd979da3d837eebd372eaebfb64fa5afc\": container with ID starting with 6a708a123a13f1ce1b492103beddccffd979da3d837eebd372eaebfb64fa5afc not found: ID does not exist" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.144130 4891 scope.go:117] "RemoveContainer" containerID="2b32950155a1691de053cab655ac476695466b0bc80774a1abcf4ea26860e63a" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.145713 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b32950155a1691de053cab655ac476695466b0bc80774a1abcf4ea26860e63a"} err="failed to get container status \"2b32950155a1691de053cab655ac476695466b0bc80774a1abcf4ea26860e63a\": rpc error: code = NotFound desc = could not find container \"2b32950155a1691de053cab655ac476695466b0bc80774a1abcf4ea26860e63a\": container with ID starting with 2b32950155a1691de053cab655ac476695466b0bc80774a1abcf4ea26860e63a not found: ID does not exist" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.147928 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.151061 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cbfd6f48f-lw2wd" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.165582 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-r7nq7" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.197577 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47e58639-672a-4281-b7da-5363647cb329-scripts\") pod \"47e58639-672a-4281-b7da-5363647cb329\" (UID: \"47e58639-672a-4281-b7da-5363647cb329\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.197645 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-ovsdbserver-nb\") pod \"a59c7eb8-6677-4725-8a96-6920e1b84c83\" (UID: \"a59c7eb8-6677-4725-8a96-6920e1b84c83\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.197688 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/47e58639-672a-4281-b7da-5363647cb329-horizon-secret-key\") pod \"47e58639-672a-4281-b7da-5363647cb329\" (UID: \"47e58639-672a-4281-b7da-5363647cb329\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.197714 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00983c03-0370-4feb-a9aa-c3d6cd63c49d-logs\") pod \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.197763 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z4rk\" (UniqueName: \"kubernetes.io/projected/47e58639-672a-4281-b7da-5363647cb329-kube-api-access-6z4rk\") pod \"47e58639-672a-4281-b7da-5363647cb329\" (UID: \"47e58639-672a-4281-b7da-5363647cb329\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.197842 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/47e58639-672a-4281-b7da-5363647cb329-config-data\") pod \"47e58639-672a-4281-b7da-5363647cb329\" (UID: \"47e58639-672a-4281-b7da-5363647cb329\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.197884 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00983c03-0370-4feb-a9aa-c3d6cd63c49d-scripts\") pod \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.197944 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00983c03-0370-4feb-a9aa-c3d6cd63c49d-internal-tls-certs\") pod \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.197970 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-config\") pod \"a59c7eb8-6677-4725-8a96-6920e1b84c83\" (UID: \"a59c7eb8-6677-4725-8a96-6920e1b84c83\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.198024 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz8v5\" (UniqueName: \"kubernetes.io/projected/a59c7eb8-6677-4725-8a96-6920e1b84c83-kube-api-access-sz8v5\") pod \"a59c7eb8-6677-4725-8a96-6920e1b84c83\" (UID: \"a59c7eb8-6677-4725-8a96-6920e1b84c83\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.198063 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vmmx\" (UniqueName: \"kubernetes.io/projected/00983c03-0370-4feb-a9aa-c3d6cd63c49d-kube-api-access-4vmmx\") pod \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 
10:05:35.198106 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-ovsdbserver-sb\") pod \"a59c7eb8-6677-4725-8a96-6920e1b84c83\" (UID: \"a59c7eb8-6677-4725-8a96-6920e1b84c83\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.198142 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00983c03-0370-4feb-a9aa-c3d6cd63c49d-config-data\") pod \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.198179 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00983c03-0370-4feb-a9aa-c3d6cd63c49d-httpd-run\") pod \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.198217 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-dns-swift-storage-0\") pod \"a59c7eb8-6677-4725-8a96-6920e1b84c83\" (UID: \"a59c7eb8-6677-4725-8a96-6920e1b84c83\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.198246 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47e58639-672a-4281-b7da-5363647cb329-logs\") pod \"47e58639-672a-4281-b7da-5363647cb329\" (UID: \"47e58639-672a-4281-b7da-5363647cb329\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.198318 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\" (UID: 
\"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.198362 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00983c03-0370-4feb-a9aa-c3d6cd63c49d-combined-ca-bundle\") pod \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\" (UID: \"00983c03-0370-4feb-a9aa-c3d6cd63c49d\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.198395 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-dns-svc\") pod \"a59c7eb8-6677-4725-8a96-6920e1b84c83\" (UID: \"a59c7eb8-6677-4725-8a96-6920e1b84c83\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.203198 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47e58639-672a-4281-b7da-5363647cb329-logs" (OuterVolumeSpecName: "logs") pod "47e58639-672a-4281-b7da-5363647cb329" (UID: "47e58639-672a-4281-b7da-5363647cb329"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.204003 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47e58639-672a-4281-b7da-5363647cb329-scripts" (OuterVolumeSpecName: "scripts") pod "47e58639-672a-4281-b7da-5363647cb329" (UID: "47e58639-672a-4281-b7da-5363647cb329"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.207260 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47e58639-672a-4281-b7da-5363647cb329-config-data" (OuterVolumeSpecName: "config-data") pod "47e58639-672a-4281-b7da-5363647cb329" (UID: "47e58639-672a-4281-b7da-5363647cb329"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.207334 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00983c03-0370-4feb-a9aa-c3d6cd63c49d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "00983c03-0370-4feb-a9aa-c3d6cd63c49d" (UID: "00983c03-0370-4feb-a9aa-c3d6cd63c49d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.209361 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00983c03-0370-4feb-a9aa-c3d6cd63c49d-logs" (OuterVolumeSpecName: "logs") pod "00983c03-0370-4feb-a9aa-c3d6cd63c49d" (UID: "00983c03-0370-4feb-a9aa-c3d6cd63c49d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.216684 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a59c7eb8-6677-4725-8a96-6920e1b84c83-kube-api-access-sz8v5" (OuterVolumeSpecName: "kube-api-access-sz8v5") pod "a59c7eb8-6677-4725-8a96-6920e1b84c83" (UID: "a59c7eb8-6677-4725-8a96-6920e1b84c83"). InnerVolumeSpecName "kube-api-access-sz8v5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.219730 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47e58639-672a-4281-b7da-5363647cb329-kube-api-access-6z4rk" (OuterVolumeSpecName: "kube-api-access-6z4rk") pod "47e58639-672a-4281-b7da-5363647cb329" (UID: "47e58639-672a-4281-b7da-5363647cb329"). InnerVolumeSpecName "kube-api-access-6z4rk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.220029 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47e58639-672a-4281-b7da-5363647cb329-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "47e58639-672a-4281-b7da-5363647cb329" (UID: "47e58639-672a-4281-b7da-5363647cb329"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.222674 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "00983c03-0370-4feb-a9aa-c3d6cd63c49d" (UID: "00983c03-0370-4feb-a9aa-c3d6cd63c49d"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.227204 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00983c03-0370-4feb-a9aa-c3d6cd63c49d-scripts" (OuterVolumeSpecName: "scripts") pod "00983c03-0370-4feb-a9aa-c3d6cd63c49d" (UID: "00983c03-0370-4feb-a9aa-c3d6cd63c49d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.243307 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00983c03-0370-4feb-a9aa-c3d6cd63c49d-kube-api-access-4vmmx" (OuterVolumeSpecName: "kube-api-access-4vmmx") pod "00983c03-0370-4feb-a9aa-c3d6cd63c49d" (UID: "00983c03-0370-4feb-a9aa-c3d6cd63c49d"). InnerVolumeSpecName "kube-api-access-4vmmx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.279854 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00983c03-0370-4feb-a9aa-c3d6cd63c49d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00983c03-0370-4feb-a9aa-c3d6cd63c49d" (UID: "00983c03-0370-4feb-a9aa-c3d6cd63c49d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.305902 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47e58639-672a-4281-b7da-5363647cb329-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.305930 4891 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00983c03-0370-4feb-a9aa-c3d6cd63c49d-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.305940 4891 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/47e58639-672a-4281-b7da-5363647cb329-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.305950 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z4rk\" (UniqueName: \"kubernetes.io/projected/47e58639-672a-4281-b7da-5363647cb329-kube-api-access-6z4rk\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.305961 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/47e58639-672a-4281-b7da-5363647cb329-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.305970 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/00983c03-0370-4feb-a9aa-c3d6cd63c49d-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.305980 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz8v5\" (UniqueName: \"kubernetes.io/projected/a59c7eb8-6677-4725-8a96-6920e1b84c83-kube-api-access-sz8v5\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.305989 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vmmx\" (UniqueName: \"kubernetes.io/projected/00983c03-0370-4feb-a9aa-c3d6cd63c49d-kube-api-access-4vmmx\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.305997 4891 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00983c03-0370-4feb-a9aa-c3d6cd63c49d-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.306013 4891 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47e58639-672a-4281-b7da-5363647cb329-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.306038 4891 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.306049 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00983c03-0370-4feb-a9aa-c3d6cd63c49d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.340273 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod 
"a59c7eb8-6677-4725-8a96-6920e1b84c83" (UID: "a59c7eb8-6677-4725-8a96-6920e1b84c83"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.340272 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a59c7eb8-6677-4725-8a96-6920e1b84c83" (UID: "a59c7eb8-6677-4725-8a96-6920e1b84c83"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.362257 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-config" (OuterVolumeSpecName: "config") pod "a59c7eb8-6677-4725-8a96-6920e1b84c83" (UID: "a59c7eb8-6677-4725-8a96-6920e1b84c83"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.382731 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a59c7eb8-6677-4725-8a96-6920e1b84c83" (UID: "a59c7eb8-6677-4725-8a96-6920e1b84c83"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.395912 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00983c03-0370-4feb-a9aa-c3d6cd63c49d-config-data" (OuterVolumeSpecName: "config-data") pod "00983c03-0370-4feb-a9aa-c3d6cd63c49d" (UID: "00983c03-0370-4feb-a9aa-c3d6cd63c49d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.408443 4891 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.408724 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a59c7eb8-6677-4725-8a96-6920e1b84c83" (UID: "a59c7eb8-6677-4725-8a96-6920e1b84c83"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.408980 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.409005 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00983c03-0370-4feb-a9aa-c3d6cd63c49d-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.409002 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00983c03-0370-4feb-a9aa-c3d6cd63c49d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "00983c03-0370-4feb-a9aa-c3d6cd63c49d" (UID: "00983c03-0370-4feb-a9aa-c3d6cd63c49d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.409035 4891 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.409048 4891 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.409057 4891 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.409066 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.510574 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a59c7eb8-6677-4725-8a96-6920e1b84c83-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.510621 4891 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00983c03-0370-4feb-a9aa-c3d6cd63c49d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.596763 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86c8665cc-jf4tb" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.616568 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-c9454dcd5-bk2r4" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.687447 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-cncns"] Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.700482 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-cncns"] Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.717349 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f13e18bf-c8d3-4e58-ada2-b3014689271e-config-data\") pod \"f13e18bf-c8d3-4e58-ada2-b3014689271e\" (UID: \"f13e18bf-c8d3-4e58-ada2-b3014689271e\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.717579 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22592ee5-954e-4360-89b6-8f45892eb270-logs\") pod \"22592ee5-954e-4360-89b6-8f45892eb270\" (UID: \"22592ee5-954e-4360-89b6-8f45892eb270\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.717644 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22592ee5-954e-4360-89b6-8f45892eb270-config-data\") pod \"22592ee5-954e-4360-89b6-8f45892eb270\" (UID: \"22592ee5-954e-4360-89b6-8f45892eb270\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.717676 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f13e18bf-c8d3-4e58-ada2-b3014689271e-logs\") pod \"f13e18bf-c8d3-4e58-ada2-b3014689271e\" (UID: \"f13e18bf-c8d3-4e58-ada2-b3014689271e\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.717753 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh9kp\" (UniqueName: 
\"kubernetes.io/projected/22592ee5-954e-4360-89b6-8f45892eb270-kube-api-access-mh9kp\") pod \"22592ee5-954e-4360-89b6-8f45892eb270\" (UID: \"22592ee5-954e-4360-89b6-8f45892eb270\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.717850 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/22592ee5-954e-4360-89b6-8f45892eb270-horizon-secret-key\") pod \"22592ee5-954e-4360-89b6-8f45892eb270\" (UID: \"22592ee5-954e-4360-89b6-8f45892eb270\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.717933 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f13e18bf-c8d3-4e58-ada2-b3014689271e-scripts\") pod \"f13e18bf-c8d3-4e58-ada2-b3014689271e\" (UID: \"f13e18bf-c8d3-4e58-ada2-b3014689271e\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.717968 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v72hf\" (UniqueName: \"kubernetes.io/projected/f13e18bf-c8d3-4e58-ada2-b3014689271e-kube-api-access-v72hf\") pod \"f13e18bf-c8d3-4e58-ada2-b3014689271e\" (UID: \"f13e18bf-c8d3-4e58-ada2-b3014689271e\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.717989 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22592ee5-954e-4360-89b6-8f45892eb270-scripts\") pod \"22592ee5-954e-4360-89b6-8f45892eb270\" (UID: \"22592ee5-954e-4360-89b6-8f45892eb270\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.718015 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f13e18bf-c8d3-4e58-ada2-b3014689271e-horizon-secret-key\") pod \"f13e18bf-c8d3-4e58-ada2-b3014689271e\" (UID: \"f13e18bf-c8d3-4e58-ada2-b3014689271e\") " Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 
10:05:35.718332 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22592ee5-954e-4360-89b6-8f45892eb270-logs" (OuterVolumeSpecName: "logs") pod "22592ee5-954e-4360-89b6-8f45892eb270" (UID: "22592ee5-954e-4360-89b6-8f45892eb270"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.718484 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22592ee5-954e-4360-89b6-8f45892eb270-config-data" (OuterVolumeSpecName: "config-data") pod "22592ee5-954e-4360-89b6-8f45892eb270" (UID: "22592ee5-954e-4360-89b6-8f45892eb270"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.718536 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f13e18bf-c8d3-4e58-ada2-b3014689271e-logs" (OuterVolumeSpecName: "logs") pod "f13e18bf-c8d3-4e58-ada2-b3014689271e" (UID: "f13e18bf-c8d3-4e58-ada2-b3014689271e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.719059 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22592ee5-954e-4360-89b6-8f45892eb270-scripts" (OuterVolumeSpecName: "scripts") pod "22592ee5-954e-4360-89b6-8f45892eb270" (UID: "22592ee5-954e-4360-89b6-8f45892eb270"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.719330 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f13e18bf-c8d3-4e58-ada2-b3014689271e-scripts" (OuterVolumeSpecName: "scripts") pod "f13e18bf-c8d3-4e58-ada2-b3014689271e" (UID: "f13e18bf-c8d3-4e58-ada2-b3014689271e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.719647 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f13e18bf-c8d3-4e58-ada2-b3014689271e-config-data" (OuterVolumeSpecName: "config-data") pod "f13e18bf-c8d3-4e58-ada2-b3014689271e" (UID: "f13e18bf-c8d3-4e58-ada2-b3014689271e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.729816 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22592ee5-954e-4360-89b6-8f45892eb270-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "22592ee5-954e-4360-89b6-8f45892eb270" (UID: "22592ee5-954e-4360-89b6-8f45892eb270"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.729836 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22592ee5-954e-4360-89b6-8f45892eb270-kube-api-access-mh9kp" (OuterVolumeSpecName: "kube-api-access-mh9kp") pod "22592ee5-954e-4360-89b6-8f45892eb270" (UID: "22592ee5-954e-4360-89b6-8f45892eb270"). InnerVolumeSpecName "kube-api-access-mh9kp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.735356 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f13e18bf-c8d3-4e58-ada2-b3014689271e-kube-api-access-v72hf" (OuterVolumeSpecName: "kube-api-access-v72hf") pod "f13e18bf-c8d3-4e58-ada2-b3014689271e" (UID: "f13e18bf-c8d3-4e58-ada2-b3014689271e"). InnerVolumeSpecName "kube-api-access-v72hf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.736034 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f13e18bf-c8d3-4e58-ada2-b3014689271e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f13e18bf-c8d3-4e58-ada2-b3014689271e" (UID: "f13e18bf-c8d3-4e58-ada2-b3014689271e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.742644 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-g57l7"] Sep 29 10:05:35 crc kubenswrapper[4891]: E0929 10:05:35.743115 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa4026f9-ee20-44aa-9575-b1e64680139a" containerName="mariadb-account-create" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.743140 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa4026f9-ee20-44aa-9575-b1e64680139a" containerName="mariadb-account-create" Sep 29 10:05:35 crc kubenswrapper[4891]: E0929 10:05:35.743163 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a59c7eb8-6677-4725-8a96-6920e1b84c83" containerName="init" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.743173 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="a59c7eb8-6677-4725-8a96-6920e1b84c83" containerName="init" Sep 29 10:05:35 crc kubenswrapper[4891]: E0929 10:05:35.743194 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00983c03-0370-4feb-a9aa-c3d6cd63c49d" containerName="glance-log" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.743203 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="00983c03-0370-4feb-a9aa-c3d6cd63c49d" containerName="glance-log" Sep 29 10:05:35 crc kubenswrapper[4891]: E0929 10:05:35.743232 4891 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="00983c03-0370-4feb-a9aa-c3d6cd63c49d" containerName="glance-httpd" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.743240 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="00983c03-0370-4feb-a9aa-c3d6cd63c49d" containerName="glance-httpd" Sep 29 10:05:35 crc kubenswrapper[4891]: E0929 10:05:35.743253 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943687e2-eefb-46b9-8595-224dc883d780" containerName="keystone-bootstrap" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.743260 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="943687e2-eefb-46b9-8595-224dc883d780" containerName="keystone-bootstrap" Sep 29 10:05:35 crc kubenswrapper[4891]: E0929 10:05:35.743283 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a59c7eb8-6677-4725-8a96-6920e1b84c83" containerName="dnsmasq-dns" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.743292 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="a59c7eb8-6677-4725-8a96-6920e1b84c83" containerName="dnsmasq-dns" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.743616 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="a59c7eb8-6677-4725-8a96-6920e1b84c83" containerName="dnsmasq-dns" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.743640 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa4026f9-ee20-44aa-9575-b1e64680139a" containerName="mariadb-account-create" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.743666 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="00983c03-0370-4feb-a9aa-c3d6cd63c49d" containerName="glance-log" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.743681 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="00983c03-0370-4feb-a9aa-c3d6cd63c49d" containerName="glance-httpd" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.743699 4891 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="943687e2-eefb-46b9-8595-224dc883d780" containerName="keystone-bootstrap" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.744706 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-g57l7" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.746870 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.747039 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.747157 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-trmdq" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.747331 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.761273 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-g57l7"] Sep 29 10:05:35 crc kubenswrapper[4891]: W0929 10:05:35.819224 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd464aff7_6448_4eaf_b88e_01a8acc3e42a.slice/crio-83c2d004cad7a9c7587924d49283013196db263ff204d435cc58b4bbd3659a90 WatchSource:0}: Error finding container 83c2d004cad7a9c7587924d49283013196db263ff204d435cc58b4bbd3659a90: Status 404 returned error can't find the container with id 83c2d004cad7a9c7587924d49283013196db263ff204d435cc58b4bbd3659a90 Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.819577 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.819856 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-config-data\") pod \"keystone-bootstrap-g57l7\" (UID: \"c2f25328-bcaa-4a33-b55b-f7a026e29087\") " pod="openstack/keystone-bootstrap-g57l7" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.819921 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-fernet-keys\") pod \"keystone-bootstrap-g57l7\" (UID: \"c2f25328-bcaa-4a33-b55b-f7a026e29087\") " pod="openstack/keystone-bootstrap-g57l7" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.819986 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-scripts\") pod \"keystone-bootstrap-g57l7\" (UID: \"c2f25328-bcaa-4a33-b55b-f7a026e29087\") " pod="openstack/keystone-bootstrap-g57l7" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.820021 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-credential-keys\") pod \"keystone-bootstrap-g57l7\" (UID: \"c2f25328-bcaa-4a33-b55b-f7a026e29087\") " pod="openstack/keystone-bootstrap-g57l7" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.820052 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjn5k\" (UniqueName: \"kubernetes.io/projected/c2f25328-bcaa-4a33-b55b-f7a026e29087-kube-api-access-rjn5k\") pod \"keystone-bootstrap-g57l7\" (UID: \"c2f25328-bcaa-4a33-b55b-f7a026e29087\") " pod="openstack/keystone-bootstrap-g57l7" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.820155 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-combined-ca-bundle\") pod \"keystone-bootstrap-g57l7\" (UID: \"c2f25328-bcaa-4a33-b55b-f7a026e29087\") " pod="openstack/keystone-bootstrap-g57l7" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.820290 4891 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/22592ee5-954e-4360-89b6-8f45892eb270-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.820308 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f13e18bf-c8d3-4e58-ada2-b3014689271e-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.820322 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v72hf\" (UniqueName: \"kubernetes.io/projected/f13e18bf-c8d3-4e58-ada2-b3014689271e-kube-api-access-v72hf\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.820336 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22592ee5-954e-4360-89b6-8f45892eb270-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.820347 4891 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f13e18bf-c8d3-4e58-ada2-b3014689271e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.820360 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f13e18bf-c8d3-4e58-ada2-b3014689271e-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.820371 4891 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/22592ee5-954e-4360-89b6-8f45892eb270-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.820386 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22592ee5-954e-4360-89b6-8f45892eb270-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.820397 4891 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f13e18bf-c8d3-4e58-ada2-b3014689271e-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.820408 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh9kp\" (UniqueName: \"kubernetes.io/projected/22592ee5-954e-4360-89b6-8f45892eb270-kube-api-access-mh9kp\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.828542 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d8cd8ff44-d8rc8"] Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.838369 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7975d54bd8-pl4st"] Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.922209 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-scripts\") pod \"keystone-bootstrap-g57l7\" (UID: \"c2f25328-bcaa-4a33-b55b-f7a026e29087\") " pod="openstack/keystone-bootstrap-g57l7" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.922286 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-credential-keys\") pod \"keystone-bootstrap-g57l7\" (UID: \"c2f25328-bcaa-4a33-b55b-f7a026e29087\") " pod="openstack/keystone-bootstrap-g57l7" Sep 29 10:05:35 crc 
kubenswrapper[4891]: I0929 10:05:35.922353 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjn5k\" (UniqueName: \"kubernetes.io/projected/c2f25328-bcaa-4a33-b55b-f7a026e29087-kube-api-access-rjn5k\") pod \"keystone-bootstrap-g57l7\" (UID: \"c2f25328-bcaa-4a33-b55b-f7a026e29087\") " pod="openstack/keystone-bootstrap-g57l7" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.922444 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-combined-ca-bundle\") pod \"keystone-bootstrap-g57l7\" (UID: \"c2f25328-bcaa-4a33-b55b-f7a026e29087\") " pod="openstack/keystone-bootstrap-g57l7" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.922519 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-config-data\") pod \"keystone-bootstrap-g57l7\" (UID: \"c2f25328-bcaa-4a33-b55b-f7a026e29087\") " pod="openstack/keystone-bootstrap-g57l7" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.922550 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-fernet-keys\") pod \"keystone-bootstrap-g57l7\" (UID: \"c2f25328-bcaa-4a33-b55b-f7a026e29087\") " pod="openstack/keystone-bootstrap-g57l7" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.928130 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-credential-keys\") pod \"keystone-bootstrap-g57l7\" (UID: \"c2f25328-bcaa-4a33-b55b-f7a026e29087\") " pod="openstack/keystone-bootstrap-g57l7" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.928380 4891 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-fernet-keys\") pod \"keystone-bootstrap-g57l7\" (UID: \"c2f25328-bcaa-4a33-b55b-f7a026e29087\") " pod="openstack/keystone-bootstrap-g57l7" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.928862 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-scripts\") pod \"keystone-bootstrap-g57l7\" (UID: \"c2f25328-bcaa-4a33-b55b-f7a026e29087\") " pod="openstack/keystone-bootstrap-g57l7" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.936443 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-combined-ca-bundle\") pod \"keystone-bootstrap-g57l7\" (UID: \"c2f25328-bcaa-4a33-b55b-f7a026e29087\") " pod="openstack/keystone-bootstrap-g57l7" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.937537 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-config-data\") pod \"keystone-bootstrap-g57l7\" (UID: \"c2f25328-bcaa-4a33-b55b-f7a026e29087\") " pod="openstack/keystone-bootstrap-g57l7" Sep 29 10:05:35 crc kubenswrapper[4891]: I0929 10:05:35.945973 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjn5k\" (UniqueName: \"kubernetes.io/projected/c2f25328-bcaa-4a33-b55b-f7a026e29087-kube-api-access-rjn5k\") pod \"keystone-bootstrap-g57l7\" (UID: \"c2f25328-bcaa-4a33-b55b-f7a026e29087\") " pod="openstack/keystone-bootstrap-g57l7" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.049281 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-c9454dcd5-bk2r4" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.049280 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c9454dcd5-bk2r4" event={"ID":"f13e18bf-c8d3-4e58-ada2-b3014689271e","Type":"ContainerDied","Data":"66bc133c77c0c9128ee13ff624e0a0aab090fa0c9eb8f543f117ec861f00a064"} Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.054919 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"00983c03-0370-4feb-a9aa-c3d6cd63c49d","Type":"ContainerDied","Data":"cd72b4f5ae38bcbd8c4e401f970dc04e041043a4d22e848fd0d7b7925fc47239"} Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.054998 4891 scope.go:117] "RemoveContainer" containerID="cdb0b1b8c07ad6f8cab1bc1ccc586e88c894142d4d2cd29eea693dae98b94df1" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.055260 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.066964 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84c6c4b8-5f24-48db-884c-dd0669cb67cc","Type":"ContainerStarted","Data":"5b7730f2941b04ddfedec2e88fff8b377d172319d98cc03b39f8f392b163219d"} Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.068782 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-r7nq7" event={"ID":"a59c7eb8-6677-4725-8a96-6920e1b84c83","Type":"ContainerDied","Data":"e013818abd86e368764516151f9b80e0205a5be5a583f43b644b6c846988fcc8"} Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.068913 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-r7nq7" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.077030 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86c8665cc-jf4tb" event={"ID":"22592ee5-954e-4360-89b6-8f45892eb270","Type":"ContainerDied","Data":"d7b877397347f964bf750bff0d74b8cee4f3c86f63570ac67ef6984217b71bb2"} Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.077082 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86c8665cc-jf4tb" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.078813 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cbfd6f48f-lw2wd" event={"ID":"47e58639-672a-4281-b7da-5363647cb329","Type":"ContainerDied","Data":"ec7be4746ff750d55e4d22ac749abd9d305a1eecfef6845d2141e0c3a0c2cc60"} Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.078843 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cbfd6f48f-lw2wd" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.081287 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"26933cfa-6c41-490b-85f7-a1e95cddfa96","Type":"ContainerStarted","Data":"4d00c6751c79e4c2551fc97c295fd2202c25cc1cb80e59dc253c6cba8392c15d"} Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.083768 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d8cd8ff44-d8rc8" event={"ID":"d464aff7-6448-4eaf-b88e-01a8acc3e42a","Type":"ContainerStarted","Data":"83c2d004cad7a9c7587924d49283013196db263ff204d435cc58b4bbd3659a90"} Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.086576 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7975d54bd8-pl4st" event={"ID":"4cfbacc9-ec34-4515-9874-1fd082cdbea3","Type":"ContainerStarted","Data":"33f370cf6b4c923624a71e57f568bd1cdea62ff07271dd7ea4b4997204c9e0d8"} Sep 29 
10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.092668 4891 scope.go:117] "RemoveContainer" containerID="4772e9da12668efa5f475d6eeb27a333d78ca81c29463d6129782f039ea2af55" Sep 29 10:05:36 crc kubenswrapper[4891]: E0929 10:05:36.095020 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-s4bk8" podUID="5872fd60-b8c9-4f00-8c9a-679960a32e27" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.134284 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-c9454dcd5-bk2r4"] Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.149748 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-c9454dcd5-bk2r4"] Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.169270 4891 scope.go:117] "RemoveContainer" containerID="ebcfbfb0783dfbcba0948383f93b72cc28f36b28cf9522ee0cba3a18c02b544b" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.172147 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-g57l7" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.196385 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.213087 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.239158 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.241492 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.245155 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.245353 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.251070 4891 scope.go:117] "RemoveContainer" containerID="3b1fd89b22e9b20a14712190d1a12ba9f8a562012c4c0e299f2df9f9e13a268b" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.251953 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.280087 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86c8665cc-jf4tb"] Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.288041 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-86c8665cc-jf4tb"] Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.312616 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7cbfd6f48f-lw2wd"] Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.321513 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7cbfd6f48f-lw2wd"] Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.330825 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-r7nq7"] Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.335053 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45c770aa-803b-4a67-9893-5476845e722b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " pod="openstack/glance-default-internal-api-0" Sep 29 
10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.335138 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45c770aa-803b-4a67-9893-5476845e722b-logs\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.335181 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlczz\" (UniqueName: \"kubernetes.io/projected/45c770aa-803b-4a67-9893-5476845e722b-kube-api-access-rlczz\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.335255 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.335338 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45c770aa-803b-4a67-9893-5476845e722b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.335362 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45c770aa-803b-4a67-9893-5476845e722b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " pod="openstack/glance-default-internal-api-0" Sep 29 
10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.335405 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45c770aa-803b-4a67-9893-5476845e722b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.335440 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c770aa-803b-4a67-9893-5476845e722b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.338780 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-r7nq7"] Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.414343 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00983c03-0370-4feb-a9aa-c3d6cd63c49d" path="/var/lib/kubelet/pods/00983c03-0370-4feb-a9aa-c3d6cd63c49d/volumes" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.416461 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22592ee5-954e-4360-89b6-8f45892eb270" path="/var/lib/kubelet/pods/22592ee5-954e-4360-89b6-8f45892eb270/volumes" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.416924 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47e58639-672a-4281-b7da-5363647cb329" path="/var/lib/kubelet/pods/47e58639-672a-4281-b7da-5363647cb329/volumes" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.417266 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="943687e2-eefb-46b9-8595-224dc883d780" path="/var/lib/kubelet/pods/943687e2-eefb-46b9-8595-224dc883d780/volumes" Sep 29 10:05:36 crc 
kubenswrapper[4891]: I0929 10:05:36.418577 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a59c7eb8-6677-4725-8a96-6920e1b84c83" path="/var/lib/kubelet/pods/a59c7eb8-6677-4725-8a96-6920e1b84c83/volumes" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.419460 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f13e18bf-c8d3-4e58-ada2-b3014689271e" path="/var/lib/kubelet/pods/f13e18bf-c8d3-4e58-ada2-b3014689271e/volumes" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.437456 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45c770aa-803b-4a67-9893-5476845e722b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.437514 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45c770aa-803b-4a67-9893-5476845e722b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.437590 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45c770aa-803b-4a67-9893-5476845e722b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.437683 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c770aa-803b-4a67-9893-5476845e722b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " 
pod="openstack/glance-default-internal-api-0" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.437749 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45c770aa-803b-4a67-9893-5476845e722b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.437821 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45c770aa-803b-4a67-9893-5476845e722b-logs\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.437859 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlczz\" (UniqueName: \"kubernetes.io/projected/45c770aa-803b-4a67-9893-5476845e722b-kube-api-access-rlczz\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.437918 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.438500 4891 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Sep 29 
10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.442283 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45c770aa-803b-4a67-9893-5476845e722b-logs\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.443805 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45c770aa-803b-4a67-9893-5476845e722b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.452450 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45c770aa-803b-4a67-9893-5476845e722b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.454449 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c770aa-803b-4a67-9893-5476845e722b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.462546 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlczz\" (UniqueName: \"kubernetes.io/projected/45c770aa-803b-4a67-9893-5476845e722b-kube-api-access-rlczz\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.462954 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45c770aa-803b-4a67-9893-5476845e722b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.467853 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45c770aa-803b-4a67-9893-5476845e722b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.492867 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.584932 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Sep 29 10:05:36 crc kubenswrapper[4891]: I0929 10:05:36.793986 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-g57l7"]
Sep 29 10:05:37 crc kubenswrapper[4891]: I0929 10:05:37.106986 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7975d54bd8-pl4st" event={"ID":"4cfbacc9-ec34-4515-9874-1fd082cdbea3","Type":"ContainerStarted","Data":"40b5245eac00287a95c8b00f8c6d769adb11cdf2a57f7302ef53817064744865"}
Sep 29 10:05:37 crc kubenswrapper[4891]: I0929 10:05:37.107497 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7975d54bd8-pl4st" event={"ID":"4cfbacc9-ec34-4515-9874-1fd082cdbea3","Type":"ContainerStarted","Data":"a39e6f0a15611e3cd02b49db7e4c7f404f1b3c67178b2785efd8953d9a079bd6"}
Sep 29 10:05:37 crc kubenswrapper[4891]: I0929 10:05:37.115072 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g57l7" event={"ID":"c2f25328-bcaa-4a33-b55b-f7a026e29087","Type":"ContainerStarted","Data":"39b5379b9aec7726d5eed8e1576b89438e7edade0f4fd10f0a0b4471d6fd8b6b"}
Sep 29 10:05:37 crc kubenswrapper[4891]: I0929 10:05:37.115142 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g57l7" event={"ID":"c2f25328-bcaa-4a33-b55b-f7a026e29087","Type":"ContainerStarted","Data":"96c92d3a1fa68444d99b8fccd01cebd518f3d633ce7680cb1a1fd8e4da983347"}
Sep 29 10:05:37 crc kubenswrapper[4891]: I0929 10:05:37.117820 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"26933cfa-6c41-490b-85f7-a1e95cddfa96","Type":"ContainerStarted","Data":"cb39dff6596854a467b2bf09612734540cb43d9c2cf3ac669ad042a6177f230b"}
Sep 29 10:05:37 crc kubenswrapper[4891]: I0929 10:05:37.123725 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d8cd8ff44-d8rc8" event={"ID":"d464aff7-6448-4eaf-b88e-01a8acc3e42a","Type":"ContainerStarted","Data":"a665243c85a0d97916664869eb1874273bc33f28c05643e8c53152de2c2ad970"}
Sep 29 10:05:37 crc kubenswrapper[4891]: I0929 10:05:37.123798 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d8cd8ff44-d8rc8" event={"ID":"d464aff7-6448-4eaf-b88e-01a8acc3e42a","Type":"ContainerStarted","Data":"7839a075915e783363c8773d9c53d7ca6e9f8f1ca84a9a16332d99edd015c3ba"}
Sep 29 10:05:37 crc kubenswrapper[4891]: I0929 10:05:37.138271 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7975d54bd8-pl4st" podStartSLOduration=12.451722762 podStartE2EDuration="13.138247741s" podCreationTimestamp="2025-09-29 10:05:24 +0000 UTC" firstStartedPulling="2025-09-29 10:05:35.836605114 +0000 UTC m=+1066.041773435" lastFinishedPulling="2025-09-29 10:05:36.523130093 +0000 UTC m=+1066.728298414" observedRunningTime="2025-09-29 10:05:37.136901201 +0000 UTC m=+1067.342069532" watchObservedRunningTime="2025-09-29 10:05:37.138247741 +0000 UTC m=+1067.343416062"
Sep 29 10:05:37 crc kubenswrapper[4891]: I0929 10:05:37.159934 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-g57l7" podStartSLOduration=2.159908121 podStartE2EDuration="2.159908121s" podCreationTimestamp="2025-09-29 10:05:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:05:37.153366498 +0000 UTC m=+1067.358534839" watchObservedRunningTime="2025-09-29 10:05:37.159908121 +0000 UTC m=+1067.365076442"
Sep 29 10:05:37 crc kubenswrapper[4891]: I0929 10:05:37.183493 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6d8cd8ff44-d8rc8" podStartSLOduration=12.551926525 podStartE2EDuration="13.183469168s" podCreationTimestamp="2025-09-29 10:05:24 +0000 UTC" firstStartedPulling="2025-09-29 10:05:35.824102184 +0000 UTC m=+1066.029270505" lastFinishedPulling="2025-09-29 10:05:36.455644827 +0000 UTC m=+1066.660813148" observedRunningTime="2025-09-29 10:05:37.18152505 +0000 UTC m=+1067.386693391" watchObservedRunningTime="2025-09-29 10:05:37.183469168 +0000 UTC m=+1067.388637479"
Sep 29 10:05:37 crc kubenswrapper[4891]: I0929 10:05:37.198252 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 29 10:05:37 crc kubenswrapper[4891]: W0929 10:05:37.549008 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45c770aa_803b_4a67_9893_5476845e722b.slice/crio-1fe8a0d195a9efdb43e00fe480201a341fd91b9f17e4bdaaf6e37f0527da4076 WatchSource:0}: Error finding container 1fe8a0d195a9efdb43e00fe480201a341fd91b9f17e4bdaaf6e37f0527da4076: Status 404 returned error can't find the container with id 1fe8a0d195a9efdb43e00fe480201a341fd91b9f17e4bdaaf6e37f0527da4076
Sep 29 10:05:38 crc kubenswrapper[4891]: I0929 10:05:38.137280 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"45c770aa-803b-4a67-9893-5476845e722b","Type":"ContainerStarted","Data":"1fe8a0d195a9efdb43e00fe480201a341fd91b9f17e4bdaaf6e37f0527da4076"}
Sep 29 10:05:38 crc kubenswrapper[4891]: I0929 10:05:38.139813 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"26933cfa-6c41-490b-85f7-a1e95cddfa96","Type":"ContainerStarted","Data":"e7acd43668b0e2b85de3e13dcab8138dd075c4d7a3ae330f5e5b2568063bf417"}
Sep 29 10:05:38 crc kubenswrapper[4891]: I0929 10:05:38.140047 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="26933cfa-6c41-490b-85f7-a1e95cddfa96" containerName="glance-log" containerID="cri-o://cb39dff6596854a467b2bf09612734540cb43d9c2cf3ac669ad042a6177f230b" gracePeriod=30
Sep 29 10:05:38 crc kubenswrapper[4891]: I0929 10:05:38.140146 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="26933cfa-6c41-490b-85f7-a1e95cddfa96" containerName="glance-httpd" containerID="cri-o://e7acd43668b0e2b85de3e13dcab8138dd075c4d7a3ae330f5e5b2568063bf417" gracePeriod=30
Sep 29 10:05:38 crc kubenswrapper[4891]: I0929 10:05:38.144839 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84c6c4b8-5f24-48db-884c-dd0669cb67cc","Type":"ContainerStarted","Data":"141da6b5d814ce53857c42a262e0c77be1c0eebedc9c307e75563c7d76b21243"}
Sep 29 10:05:38 crc kubenswrapper[4891]: I0929 10:05:38.175269 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=17.175242623 podStartE2EDuration="17.175242623s" podCreationTimestamp="2025-09-29 10:05:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:05:38.173699097 +0000 UTC m=+1068.378867428" watchObservedRunningTime="2025-09-29 10:05:38.175242623 +0000 UTC m=+1068.380410964"
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.156719 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"45c770aa-803b-4a67-9893-5476845e722b","Type":"ContainerStarted","Data":"d0669b907cc3d7e8c87cadfaf246355966f47174c884b019614d59bfb2f25fde"}
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.163867 4891 generic.go:334] "Generic (PLEG): container finished" podID="26933cfa-6c41-490b-85f7-a1e95cddfa96" containerID="e7acd43668b0e2b85de3e13dcab8138dd075c4d7a3ae330f5e5b2568063bf417" exitCode=0
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.163947 4891 generic.go:334] "Generic (PLEG): container finished" podID="26933cfa-6c41-490b-85f7-a1e95cddfa96" containerID="cb39dff6596854a467b2bf09612734540cb43d9c2cf3ac669ad042a6177f230b" exitCode=143
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.163942 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"26933cfa-6c41-490b-85f7-a1e95cddfa96","Type":"ContainerDied","Data":"e7acd43668b0e2b85de3e13dcab8138dd075c4d7a3ae330f5e5b2568063bf417"}
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.164036 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"26933cfa-6c41-490b-85f7-a1e95cddfa96","Type":"ContainerDied","Data":"cb39dff6596854a467b2bf09612734540cb43d9c2cf3ac669ad042a6177f230b"}
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.639057 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.707969 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26933cfa-6c41-490b-85f7-a1e95cddfa96-public-tls-certs\") pod \"26933cfa-6c41-490b-85f7-a1e95cddfa96\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") "
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.708092 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26933cfa-6c41-490b-85f7-a1e95cddfa96-config-data\") pod \"26933cfa-6c41-490b-85f7-a1e95cddfa96\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") "
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.708224 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4nlr\" (UniqueName: \"kubernetes.io/projected/26933cfa-6c41-490b-85f7-a1e95cddfa96-kube-api-access-w4nlr\") pod \"26933cfa-6c41-490b-85f7-a1e95cddfa96\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") "
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.708298 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26933cfa-6c41-490b-85f7-a1e95cddfa96-scripts\") pod \"26933cfa-6c41-490b-85f7-a1e95cddfa96\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") "
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.708322 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26933cfa-6c41-490b-85f7-a1e95cddfa96-combined-ca-bundle\") pod \"26933cfa-6c41-490b-85f7-a1e95cddfa96\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") "
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.708348 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/26933cfa-6c41-490b-85f7-a1e95cddfa96-httpd-run\") pod \"26933cfa-6c41-490b-85f7-a1e95cddfa96\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") "
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.708375 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26933cfa-6c41-490b-85f7-a1e95cddfa96-logs\") pod \"26933cfa-6c41-490b-85f7-a1e95cddfa96\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") "
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.708424 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"26933cfa-6c41-490b-85f7-a1e95cddfa96\" (UID: \"26933cfa-6c41-490b-85f7-a1e95cddfa96\") "
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.709423 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26933cfa-6c41-490b-85f7-a1e95cddfa96-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "26933cfa-6c41-490b-85f7-a1e95cddfa96" (UID: "26933cfa-6c41-490b-85f7-a1e95cddfa96"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.709733 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26933cfa-6c41-490b-85f7-a1e95cddfa96-logs" (OuterVolumeSpecName: "logs") pod "26933cfa-6c41-490b-85f7-a1e95cddfa96" (UID: "26933cfa-6c41-490b-85f7-a1e95cddfa96"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.715593 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "26933cfa-6c41-490b-85f7-a1e95cddfa96" (UID: "26933cfa-6c41-490b-85f7-a1e95cddfa96"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.716029 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26933cfa-6c41-490b-85f7-a1e95cddfa96-scripts" (OuterVolumeSpecName: "scripts") pod "26933cfa-6c41-490b-85f7-a1e95cddfa96" (UID: "26933cfa-6c41-490b-85f7-a1e95cddfa96"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.732015 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26933cfa-6c41-490b-85f7-a1e95cddfa96-kube-api-access-w4nlr" (OuterVolumeSpecName: "kube-api-access-w4nlr") pod "26933cfa-6c41-490b-85f7-a1e95cddfa96" (UID: "26933cfa-6c41-490b-85f7-a1e95cddfa96"). InnerVolumeSpecName "kube-api-access-w4nlr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.751930 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26933cfa-6c41-490b-85f7-a1e95cddfa96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26933cfa-6c41-490b-85f7-a1e95cddfa96" (UID: "26933cfa-6c41-490b-85f7-a1e95cddfa96"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.762053 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26933cfa-6c41-490b-85f7-a1e95cddfa96-config-data" (OuterVolumeSpecName: "config-data") pod "26933cfa-6c41-490b-85f7-a1e95cddfa96" (UID: "26933cfa-6c41-490b-85f7-a1e95cddfa96"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.781940 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26933cfa-6c41-490b-85f7-a1e95cddfa96-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "26933cfa-6c41-490b-85f7-a1e95cddfa96" (UID: "26933cfa-6c41-490b-85f7-a1e95cddfa96"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.813024 4891 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26933cfa-6c41-490b-85f7-a1e95cddfa96-public-tls-certs\") on node \"crc\" DevicePath \"\""
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.813070 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26933cfa-6c41-490b-85f7-a1e95cddfa96-config-data\") on node \"crc\" DevicePath \"\""
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.813084 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4nlr\" (UniqueName: \"kubernetes.io/projected/26933cfa-6c41-490b-85f7-a1e95cddfa96-kube-api-access-w4nlr\") on node \"crc\" DevicePath \"\""
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.813101 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26933cfa-6c41-490b-85f7-a1e95cddfa96-scripts\") on node \"crc\" DevicePath \"\""
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.813115 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26933cfa-6c41-490b-85f7-a1e95cddfa96-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.813129 4891 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/26933cfa-6c41-490b-85f7-a1e95cddfa96-httpd-run\") on node \"crc\" DevicePath \"\""
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.813142 4891 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26933cfa-6c41-490b-85f7-a1e95cddfa96-logs\") on node \"crc\" DevicePath \"\""
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.813188 4891 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.835411 4891 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Sep 29 10:05:39 crc kubenswrapper[4891]: I0929 10:05:39.919118 4891 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.076479 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-hgm8s"]
Sep 29 10:05:40 crc kubenswrapper[4891]: E0929 10:05:40.077306 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26933cfa-6c41-490b-85f7-a1e95cddfa96" containerName="glance-httpd"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.077329 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="26933cfa-6c41-490b-85f7-a1e95cddfa96" containerName="glance-httpd"
Sep 29 10:05:40 crc kubenswrapper[4891]: E0929 10:05:40.077408 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26933cfa-6c41-490b-85f7-a1e95cddfa96" containerName="glance-log"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.077420 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="26933cfa-6c41-490b-85f7-a1e95cddfa96" containerName="glance-log"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.077842 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="26933cfa-6c41-490b-85f7-a1e95cddfa96" containerName="glance-httpd"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.077865 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="26933cfa-6c41-490b-85f7-a1e95cddfa96" containerName="glance-log"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.078884 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-hgm8s"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.081300 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.082467 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-jz6qc"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.082966 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.105662 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-hgm8s"]
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.187145 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"26933cfa-6c41-490b-85f7-a1e95cddfa96","Type":"ContainerDied","Data":"4d00c6751c79e4c2551fc97c295fd2202c25cc1cb80e59dc253c6cba8392c15d"}
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.187224 4891 scope.go:117] "RemoveContainer" containerID="e7acd43668b0e2b85de3e13dcab8138dd075c4d7a3ae330f5e5b2568063bf417"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.187530 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.204714 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"45c770aa-803b-4a67-9893-5476845e722b","Type":"ContainerStarted","Data":"9103a76e6bcf89965269b9e15298120cb4b4733599df4c283fe1de2ab1b5b6f9"}
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.225075 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f7dd6438-e338-4dce-b2be-0e36b359631c-db-sync-config-data\") pod \"cinder-db-sync-hgm8s\" (UID: \"f7dd6438-e338-4dce-b2be-0e36b359631c\") " pod="openstack/cinder-db-sync-hgm8s"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.225121 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f7dd6438-e338-4dce-b2be-0e36b359631c-etc-machine-id\") pod \"cinder-db-sync-hgm8s\" (UID: \"f7dd6438-e338-4dce-b2be-0e36b359631c\") " pod="openstack/cinder-db-sync-hgm8s"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.225170 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7dd6438-e338-4dce-b2be-0e36b359631c-scripts\") pod \"cinder-db-sync-hgm8s\" (UID: \"f7dd6438-e338-4dce-b2be-0e36b359631c\") " pod="openstack/cinder-db-sync-hgm8s"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.225202 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7dd6438-e338-4dce-b2be-0e36b359631c-config-data\") pod \"cinder-db-sync-hgm8s\" (UID: \"f7dd6438-e338-4dce-b2be-0e36b359631c\") " pod="openstack/cinder-db-sync-hgm8s"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.225243 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-456p7\" (UniqueName: \"kubernetes.io/projected/f7dd6438-e338-4dce-b2be-0e36b359631c-kube-api-access-456p7\") pod \"cinder-db-sync-hgm8s\" (UID: \"f7dd6438-e338-4dce-b2be-0e36b359631c\") " pod="openstack/cinder-db-sync-hgm8s"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.225323 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7dd6438-e338-4dce-b2be-0e36b359631c-combined-ca-bundle\") pod \"cinder-db-sync-hgm8s\" (UID: \"f7dd6438-e338-4dce-b2be-0e36b359631c\") " pod="openstack/cinder-db-sync-hgm8s"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.233894 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.233850881 podStartE2EDuration="4.233850881s" podCreationTimestamp="2025-09-29 10:05:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:05:40.229123291 +0000 UTC m=+1070.434291632" watchObservedRunningTime="2025-09-29 10:05:40.233850881 +0000 UTC m=+1070.439019202"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.240358 4891 scope.go:117] "RemoveContainer" containerID="cb39dff6596854a467b2bf09612734540cb43d9c2cf3ac669ad042a6177f230b"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.284455 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.316276 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.324935 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.326894 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7dd6438-e338-4dce-b2be-0e36b359631c-scripts\") pod \"cinder-db-sync-hgm8s\" (UID: \"f7dd6438-e338-4dce-b2be-0e36b359631c\") " pod="openstack/cinder-db-sync-hgm8s"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.326946 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7dd6438-e338-4dce-b2be-0e36b359631c-config-data\") pod \"cinder-db-sync-hgm8s\" (UID: \"f7dd6438-e338-4dce-b2be-0e36b359631c\") " pod="openstack/cinder-db-sync-hgm8s"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.326991 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-456p7\" (UniqueName: \"kubernetes.io/projected/f7dd6438-e338-4dce-b2be-0e36b359631c-kube-api-access-456p7\") pod \"cinder-db-sync-hgm8s\" (UID: \"f7dd6438-e338-4dce-b2be-0e36b359631c\") " pod="openstack/cinder-db-sync-hgm8s"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.327057 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7dd6438-e338-4dce-b2be-0e36b359631c-combined-ca-bundle\") pod \"cinder-db-sync-hgm8s\" (UID: \"f7dd6438-e338-4dce-b2be-0e36b359631c\") " pod="openstack/cinder-db-sync-hgm8s"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.327110 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f7dd6438-e338-4dce-b2be-0e36b359631c-db-sync-config-data\") pod \"cinder-db-sync-hgm8s\" (UID: \"f7dd6438-e338-4dce-b2be-0e36b359631c\") " pod="openstack/cinder-db-sync-hgm8s"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.327131 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f7dd6438-e338-4dce-b2be-0e36b359631c-etc-machine-id\") pod \"cinder-db-sync-hgm8s\" (UID: \"f7dd6438-e338-4dce-b2be-0e36b359631c\") " pod="openstack/cinder-db-sync-hgm8s"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.327154 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.327215 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f7dd6438-e338-4dce-b2be-0e36b359631c-etc-machine-id\") pod \"cinder-db-sync-hgm8s\" (UID: \"f7dd6438-e338-4dce-b2be-0e36b359631c\") " pod="openstack/cinder-db-sync-hgm8s"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.330990 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.331411 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.332128 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.341349 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7dd6438-e338-4dce-b2be-0e36b359631c-config-data\") pod \"cinder-db-sync-hgm8s\" (UID: \"f7dd6438-e338-4dce-b2be-0e36b359631c\") " pod="openstack/cinder-db-sync-hgm8s"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.346871 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7dd6438-e338-4dce-b2be-0e36b359631c-combined-ca-bundle\") pod \"cinder-db-sync-hgm8s\" (UID: \"f7dd6438-e338-4dce-b2be-0e36b359631c\") " pod="openstack/cinder-db-sync-hgm8s"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.347583 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7dd6438-e338-4dce-b2be-0e36b359631c-scripts\") pod \"cinder-db-sync-hgm8s\" (UID: \"f7dd6438-e338-4dce-b2be-0e36b359631c\") " pod="openstack/cinder-db-sync-hgm8s"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.354843 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f7dd6438-e338-4dce-b2be-0e36b359631c-db-sync-config-data\") pod \"cinder-db-sync-hgm8s\" (UID: \"f7dd6438-e338-4dce-b2be-0e36b359631c\") " pod="openstack/cinder-db-sync-hgm8s"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.365480 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-456p7\" (UniqueName: \"kubernetes.io/projected/f7dd6438-e338-4dce-b2be-0e36b359631c-kube-api-access-456p7\") pod \"cinder-db-sync-hgm8s\" (UID: \"f7dd6438-e338-4dce-b2be-0e36b359631c\") " pod="openstack/cinder-db-sync-hgm8s"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.426332 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-hgm8s"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.428158 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.428194 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076fdc73-b211-49f2-89d6-ee6fff802f74-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.428230 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/076fdc73-b211-49f2-89d6-ee6fff802f74-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.428301 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/076fdc73-b211-49f2-89d6-ee6fff802f74-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.428351 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/076fdc73-b211-49f2-89d6-ee6fff802f74-scripts\") pod \"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.428380 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/076fdc73-b211-49f2-89d6-ee6fff802f74-logs\") pod \"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.428421 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/076fdc73-b211-49f2-89d6-ee6fff802f74-config-data\") pod \"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.428448 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmm6l\" (UniqueName: \"kubernetes.io/projected/076fdc73-b211-49f2-89d6-ee6fff802f74-kube-api-access-kmm6l\") pod \"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.431163 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26933cfa-6c41-490b-85f7-a1e95cddfa96" path="/var/lib/kubelet/pods/26933cfa-6c41-490b-85f7-a1e95cddfa96/volumes"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.530644 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.530703 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076fdc73-b211-49f2-89d6-ee6fff802f74-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.530814 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/076fdc73-b211-49f2-89d6-ee6fff802f74-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.530860 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/076fdc73-b211-49f2-89d6-ee6fff802f74-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.530973 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/076fdc73-b211-49f2-89d6-ee6fff802f74-scripts\") pod \"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.531032 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/076fdc73-b211-49f2-89d6-ee6fff802f74-logs\") pod \"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.531097 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/076fdc73-b211-49f2-89d6-ee6fff802f74-config-data\") pod \"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.531136 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmm6l\" (UniqueName: \"kubernetes.io/projected/076fdc73-b211-49f2-89d6-ee6fff802f74-kube-api-access-kmm6l\") pod \"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.532750 4891 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.537533 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/076fdc73-b211-49f2-89d6-ee6fff802f74-logs\") pod \"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.540780 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/076fdc73-b211-49f2-89d6-ee6fff802f74-scripts\") pod \"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " pod="openstack/glance-default-external-api-0"
Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.541122 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/076fdc73-b211-49f2-89d6-ee6fff802f74-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.547994 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/076fdc73-b211-49f2-89d6-ee6fff802f74-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.554051 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/076fdc73-b211-49f2-89d6-ee6fff802f74-config-data\") pod \"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.555359 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmm6l\" (UniqueName: \"kubernetes.io/projected/076fdc73-b211-49f2-89d6-ee6fff802f74-kube-api-access-kmm6l\") pod \"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.555968 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/076fdc73-b211-49f2-89d6-ee6fff802f74-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.618321 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"glance-default-external-api-0\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " pod="openstack/glance-default-external-api-0" Sep 29 10:05:40 crc kubenswrapper[4891]: I0929 10:05:40.664549 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 10:05:41 crc kubenswrapper[4891]: I0929 10:05:41.226278 4891 generic.go:334] "Generic (PLEG): container finished" podID="c2f25328-bcaa-4a33-b55b-f7a026e29087" containerID="39b5379b9aec7726d5eed8e1576b89438e7edade0f4fd10f0a0b4471d6fd8b6b" exitCode=0 Sep 29 10:05:41 crc kubenswrapper[4891]: I0929 10:05:41.226365 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g57l7" event={"ID":"c2f25328-bcaa-4a33-b55b-f7a026e29087","Type":"ContainerDied","Data":"39b5379b9aec7726d5eed8e1576b89438e7edade0f4fd10f0a0b4471d6fd8b6b"} Sep 29 10:05:41 crc kubenswrapper[4891]: I0929 10:05:41.352818 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-hgm8s"] Sep 29 10:05:41 crc kubenswrapper[4891]: W0929 10:05:41.365345 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7dd6438_e338_4dce_b2be_0e36b359631c.slice/crio-9ad034a361cf152236a509595cced45e1bc679d65914c13e5eebb429556a1e62 WatchSource:0}: Error finding container 9ad034a361cf152236a509595cced45e1bc679d65914c13e5eebb429556a1e62: Status 404 returned error can't find the container with id 9ad034a361cf152236a509595cced45e1bc679d65914c13e5eebb429556a1e62 Sep 29 10:05:41 crc kubenswrapper[4891]: I0929 10:05:41.573281 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:05:42 crc kubenswrapper[4891]: I0929 10:05:42.241535 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-hgm8s" 
event={"ID":"f7dd6438-e338-4dce-b2be-0e36b359631c","Type":"ContainerStarted","Data":"9ad034a361cf152236a509595cced45e1bc679d65914c13e5eebb429556a1e62"} Sep 29 10:05:42 crc kubenswrapper[4891]: I0929 10:05:42.243911 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"076fdc73-b211-49f2-89d6-ee6fff802f74","Type":"ContainerStarted","Data":"186f06960a826dcb07dcf17adc02b3c1ac8d2c13ec642e0fb352ef4a0982656c"} Sep 29 10:05:43 crc kubenswrapper[4891]: I0929 10:05:43.256494 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"076fdc73-b211-49f2-89d6-ee6fff802f74","Type":"ContainerStarted","Data":"dc9d8171bcdde9cf808937e4987bab002f8d57bf3f88268920642ae422da90c6"} Sep 29 10:05:44 crc kubenswrapper[4891]: I0929 10:05:44.580998 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:05:44 crc kubenswrapper[4891]: I0929 10:05:44.581068 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:05:44 crc kubenswrapper[4891]: I0929 10:05:44.683667 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:44 crc kubenswrapper[4891]: I0929 10:05:44.684170 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:05:46 crc kubenswrapper[4891]: I0929 10:05:46.587095 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 29 10:05:46 crc kubenswrapper[4891]: I0929 10:05:46.587170 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 29 10:05:46 crc kubenswrapper[4891]: I0929 10:05:46.633712 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Sep 29 10:05:46 crc kubenswrapper[4891]: I0929 10:05:46.651044 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 29 10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.311482 4891 generic.go:334] "Generic (PLEG): container finished" podID="e641569b-322f-4157-aaf2-44d5f700234d" containerID="d53bf5890f49da04046f8b0e6e3e76b26c13f583a8914579e66d0cb000c17d7a" exitCode=0 Sep 29 10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.311557 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bm7bg" event={"ID":"e641569b-322f-4157-aaf2-44d5f700234d","Type":"ContainerDied","Data":"d53bf5890f49da04046f8b0e6e3e76b26c13f583a8914579e66d0cb000c17d7a"} Sep 29 10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.318338 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g57l7" event={"ID":"c2f25328-bcaa-4a33-b55b-f7a026e29087","Type":"ContainerDied","Data":"96c92d3a1fa68444d99b8fccd01cebd518f3d633ce7680cb1a1fd8e4da983347"} Sep 29 10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.318363 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96c92d3a1fa68444d99b8fccd01cebd518f3d633ce7680cb1a1fd8e4da983347" Sep 29 10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.318381 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 29 10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.318485 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 29 10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.457393 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-g57l7" Sep 29 10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.614283 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-scripts\") pod \"c2f25328-bcaa-4a33-b55b-f7a026e29087\" (UID: \"c2f25328-bcaa-4a33-b55b-f7a026e29087\") " Sep 29 10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.614412 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-combined-ca-bundle\") pod \"c2f25328-bcaa-4a33-b55b-f7a026e29087\" (UID: \"c2f25328-bcaa-4a33-b55b-f7a026e29087\") " Sep 29 10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.614574 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-fernet-keys\") pod \"c2f25328-bcaa-4a33-b55b-f7a026e29087\" (UID: \"c2f25328-bcaa-4a33-b55b-f7a026e29087\") " Sep 29 10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.614650 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-credential-keys\") pod \"c2f25328-bcaa-4a33-b55b-f7a026e29087\" (UID: \"c2f25328-bcaa-4a33-b55b-f7a026e29087\") " Sep 29 10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.614687 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjn5k\" (UniqueName: \"kubernetes.io/projected/c2f25328-bcaa-4a33-b55b-f7a026e29087-kube-api-access-rjn5k\") pod \"c2f25328-bcaa-4a33-b55b-f7a026e29087\" (UID: \"c2f25328-bcaa-4a33-b55b-f7a026e29087\") " Sep 29 10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.614710 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-config-data\") pod \"c2f25328-bcaa-4a33-b55b-f7a026e29087\" (UID: \"c2f25328-bcaa-4a33-b55b-f7a026e29087\") " Sep 29 10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.622240 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2f25328-bcaa-4a33-b55b-f7a026e29087-kube-api-access-rjn5k" (OuterVolumeSpecName: "kube-api-access-rjn5k") pod "c2f25328-bcaa-4a33-b55b-f7a026e29087" (UID: "c2f25328-bcaa-4a33-b55b-f7a026e29087"). InnerVolumeSpecName "kube-api-access-rjn5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.622637 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c2f25328-bcaa-4a33-b55b-f7a026e29087" (UID: "c2f25328-bcaa-4a33-b55b-f7a026e29087"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.623024 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c2f25328-bcaa-4a33-b55b-f7a026e29087" (UID: "c2f25328-bcaa-4a33-b55b-f7a026e29087"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.632120 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-scripts" (OuterVolumeSpecName: "scripts") pod "c2f25328-bcaa-4a33-b55b-f7a026e29087" (UID: "c2f25328-bcaa-4a33-b55b-f7a026e29087"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.649394 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-config-data" (OuterVolumeSpecName: "config-data") pod "c2f25328-bcaa-4a33-b55b-f7a026e29087" (UID: "c2f25328-bcaa-4a33-b55b-f7a026e29087"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.656023 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2f25328-bcaa-4a33-b55b-f7a026e29087" (UID: "c2f25328-bcaa-4a33-b55b-f7a026e29087"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.716531 4891 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.716574 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjn5k\" (UniqueName: \"kubernetes.io/projected/c2f25328-bcaa-4a33-b55b-f7a026e29087-kube-api-access-rjn5k\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.716587 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.716596 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 
10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.716606 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:47 crc kubenswrapper[4891]: I0929 10:05:47.716614 4891 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c2f25328-bcaa-4a33-b55b-f7a026e29087-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.328244 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"076fdc73-b211-49f2-89d6-ee6fff802f74","Type":"ContainerStarted","Data":"c96f65321403a5f5053ff8ebc61e35675865e2e4730ce75835100b436eb5a932"} Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.332469 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84c6c4b8-5f24-48db-884c-dd0669cb67cc","Type":"ContainerStarted","Data":"9269e8a955768ee357f1164b2c49f489cfe419feb6e4330239fffef531b0d0ee"} Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.332493 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-g57l7" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.368729 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.36870069 podStartE2EDuration="8.36870069s" podCreationTimestamp="2025-09-29 10:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:05:48.358403556 +0000 UTC m=+1078.563571897" watchObservedRunningTime="2025-09-29 10:05:48.36870069 +0000 UTC m=+1078.573869011" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.560985 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5b9ccf6696-q5jmg"] Sep 29 10:05:48 crc kubenswrapper[4891]: E0929 10:05:48.561978 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2f25328-bcaa-4a33-b55b-f7a026e29087" containerName="keystone-bootstrap" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.561998 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f25328-bcaa-4a33-b55b-f7a026e29087" containerName="keystone-bootstrap" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.562211 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2f25328-bcaa-4a33-b55b-f7a026e29087" containerName="keystone-bootstrap" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.562886 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.565458 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.565849 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.565907 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.565988 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-trmdq" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.566179 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.566212 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.579146 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5b9ccf6696-q5jmg"] Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.637956 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wxjk\" (UniqueName: \"kubernetes.io/projected/b4050314-008a-4b46-93e7-2d9454fa3d89-kube-api-access-2wxjk\") pod \"keystone-5b9ccf6696-q5jmg\" (UID: \"b4050314-008a-4b46-93e7-2d9454fa3d89\") " pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.638032 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4050314-008a-4b46-93e7-2d9454fa3d89-internal-tls-certs\") pod \"keystone-5b9ccf6696-q5jmg\" (UID: 
\"b4050314-008a-4b46-93e7-2d9454fa3d89\") " pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.638097 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b4050314-008a-4b46-93e7-2d9454fa3d89-credential-keys\") pod \"keystone-5b9ccf6696-q5jmg\" (UID: \"b4050314-008a-4b46-93e7-2d9454fa3d89\") " pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.638626 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b4050314-008a-4b46-93e7-2d9454fa3d89-fernet-keys\") pod \"keystone-5b9ccf6696-q5jmg\" (UID: \"b4050314-008a-4b46-93e7-2d9454fa3d89\") " pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.638908 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4050314-008a-4b46-93e7-2d9454fa3d89-combined-ca-bundle\") pod \"keystone-5b9ccf6696-q5jmg\" (UID: \"b4050314-008a-4b46-93e7-2d9454fa3d89\") " pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.638932 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4050314-008a-4b46-93e7-2d9454fa3d89-public-tls-certs\") pod \"keystone-5b9ccf6696-q5jmg\" (UID: \"b4050314-008a-4b46-93e7-2d9454fa3d89\") " pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.639076 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4050314-008a-4b46-93e7-2d9454fa3d89-config-data\") pod \"keystone-5b9ccf6696-q5jmg\" (UID: 
\"b4050314-008a-4b46-93e7-2d9454fa3d89\") " pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.639113 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4050314-008a-4b46-93e7-2d9454fa3d89-scripts\") pod \"keystone-5b9ccf6696-q5jmg\" (UID: \"b4050314-008a-4b46-93e7-2d9454fa3d89\") " pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.740488 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4050314-008a-4b46-93e7-2d9454fa3d89-combined-ca-bundle\") pod \"keystone-5b9ccf6696-q5jmg\" (UID: \"b4050314-008a-4b46-93e7-2d9454fa3d89\") " pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.740546 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4050314-008a-4b46-93e7-2d9454fa3d89-public-tls-certs\") pod \"keystone-5b9ccf6696-q5jmg\" (UID: \"b4050314-008a-4b46-93e7-2d9454fa3d89\") " pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.740595 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4050314-008a-4b46-93e7-2d9454fa3d89-config-data\") pod \"keystone-5b9ccf6696-q5jmg\" (UID: \"b4050314-008a-4b46-93e7-2d9454fa3d89\") " pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.740617 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4050314-008a-4b46-93e7-2d9454fa3d89-scripts\") pod \"keystone-5b9ccf6696-q5jmg\" (UID: \"b4050314-008a-4b46-93e7-2d9454fa3d89\") " pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 
crc kubenswrapper[4891]: I0929 10:05:48.740665 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wxjk\" (UniqueName: \"kubernetes.io/projected/b4050314-008a-4b46-93e7-2d9454fa3d89-kube-api-access-2wxjk\") pod \"keystone-5b9ccf6696-q5jmg\" (UID: \"b4050314-008a-4b46-93e7-2d9454fa3d89\") " pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.740688 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4050314-008a-4b46-93e7-2d9454fa3d89-internal-tls-certs\") pod \"keystone-5b9ccf6696-q5jmg\" (UID: \"b4050314-008a-4b46-93e7-2d9454fa3d89\") " pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.740712 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b4050314-008a-4b46-93e7-2d9454fa3d89-credential-keys\") pod \"keystone-5b9ccf6696-q5jmg\" (UID: \"b4050314-008a-4b46-93e7-2d9454fa3d89\") " pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.740732 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b4050314-008a-4b46-93e7-2d9454fa3d89-fernet-keys\") pod \"keystone-5b9ccf6696-q5jmg\" (UID: \"b4050314-008a-4b46-93e7-2d9454fa3d89\") " pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.748818 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4050314-008a-4b46-93e7-2d9454fa3d89-public-tls-certs\") pod \"keystone-5b9ccf6696-q5jmg\" (UID: \"b4050314-008a-4b46-93e7-2d9454fa3d89\") " pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.750184 4891 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4050314-008a-4b46-93e7-2d9454fa3d89-internal-tls-certs\") pod \"keystone-5b9ccf6696-q5jmg\" (UID: \"b4050314-008a-4b46-93e7-2d9454fa3d89\") " pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.753166 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b4050314-008a-4b46-93e7-2d9454fa3d89-credential-keys\") pod \"keystone-5b9ccf6696-q5jmg\" (UID: \"b4050314-008a-4b46-93e7-2d9454fa3d89\") " pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.759357 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b4050314-008a-4b46-93e7-2d9454fa3d89-fernet-keys\") pod \"keystone-5b9ccf6696-q5jmg\" (UID: \"b4050314-008a-4b46-93e7-2d9454fa3d89\") " pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.759901 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4050314-008a-4b46-93e7-2d9454fa3d89-combined-ca-bundle\") pod \"keystone-5b9ccf6696-q5jmg\" (UID: \"b4050314-008a-4b46-93e7-2d9454fa3d89\") " pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.760400 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4050314-008a-4b46-93e7-2d9454fa3d89-config-data\") pod \"keystone-5b9ccf6696-q5jmg\" (UID: \"b4050314-008a-4b46-93e7-2d9454fa3d89\") " pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.761535 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b4050314-008a-4b46-93e7-2d9454fa3d89-scripts\") pod \"keystone-5b9ccf6696-q5jmg\" (UID: \"b4050314-008a-4b46-93e7-2d9454fa3d89\") " pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.764954 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wxjk\" (UniqueName: \"kubernetes.io/projected/b4050314-008a-4b46-93e7-2d9454fa3d89-kube-api-access-2wxjk\") pod \"keystone-5b9ccf6696-q5jmg\" (UID: \"b4050314-008a-4b46-93e7-2d9454fa3d89\") " pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.853990 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bm7bg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.897941 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.944271 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47wk5\" (UniqueName: \"kubernetes.io/projected/e641569b-322f-4157-aaf2-44d5f700234d-kube-api-access-47wk5\") pod \"e641569b-322f-4157-aaf2-44d5f700234d\" (UID: \"e641569b-322f-4157-aaf2-44d5f700234d\") " Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.944415 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e641569b-322f-4157-aaf2-44d5f700234d-combined-ca-bundle\") pod \"e641569b-322f-4157-aaf2-44d5f700234d\" (UID: \"e641569b-322f-4157-aaf2-44d5f700234d\") " Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.944468 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e641569b-322f-4157-aaf2-44d5f700234d-config\") pod \"e641569b-322f-4157-aaf2-44d5f700234d\" (UID: 
\"e641569b-322f-4157-aaf2-44d5f700234d\") " Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.952747 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e641569b-322f-4157-aaf2-44d5f700234d-kube-api-access-47wk5" (OuterVolumeSpecName: "kube-api-access-47wk5") pod "e641569b-322f-4157-aaf2-44d5f700234d" (UID: "e641569b-322f-4157-aaf2-44d5f700234d"). InnerVolumeSpecName "kube-api-access-47wk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.977927 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e641569b-322f-4157-aaf2-44d5f700234d-config" (OuterVolumeSpecName: "config") pod "e641569b-322f-4157-aaf2-44d5f700234d" (UID: "e641569b-322f-4157-aaf2-44d5f700234d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:48 crc kubenswrapper[4891]: I0929 10:05:48.978087 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e641569b-322f-4157-aaf2-44d5f700234d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e641569b-322f-4157-aaf2-44d5f700234d" (UID: "e641569b-322f-4157-aaf2-44d5f700234d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.046613 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e641569b-322f-4157-aaf2-44d5f700234d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.046656 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e641569b-322f-4157-aaf2-44d5f700234d-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.046671 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47wk5\" (UniqueName: \"kubernetes.io/projected/e641569b-322f-4157-aaf2-44d5f700234d-kube-api-access-47wk5\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.382829 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bm7bg" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.383282 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bm7bg" event={"ID":"e641569b-322f-4157-aaf2-44d5f700234d","Type":"ContainerDied","Data":"834e8e0d50f9e1347cd44f91cd13b97e6442ded6c0c8beaa570743074308aee1"} Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.383340 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="834e8e0d50f9e1347cd44f91cd13b97e6442ded6c0c8beaa570743074308aee1" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.383473 4891 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.383482 4891 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.501186 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-5b9ccf6696-q5jmg"] Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.658492 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-qjfvh"] Sep 29 10:05:49 crc kubenswrapper[4891]: E0929 10:05:49.659444 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e641569b-322f-4157-aaf2-44d5f700234d" containerName="neutron-db-sync" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.659463 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="e641569b-322f-4157-aaf2-44d5f700234d" containerName="neutron-db-sync" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.659683 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="e641569b-322f-4157-aaf2-44d5f700234d" containerName="neutron-db-sync" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.664257 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.701811 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-qjfvh"] Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.784208 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7f8b757f88-7hxnh"] Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.791866 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f8b757f88-7hxnh" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.797845 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.798290 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-g9w5p" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.798538 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.799607 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh2rq\" (UniqueName: \"kubernetes.io/projected/7e2bae28-f861-42b0-8fff-59b6516f85ff-kube-api-access-mh2rq\") pod \"dnsmasq-dns-5ccc5c4795-qjfvh\" (UID: \"7e2bae28-f861-42b0-8fff-59b6516f85ff\") " pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.799834 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-qjfvh\" (UID: \"7e2bae28-f861-42b0-8fff-59b6516f85ff\") " pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.800004 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-qjfvh\" (UID: \"7e2bae28-f861-42b0-8fff-59b6516f85ff\") " pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.800067 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-config\") pod \"dnsmasq-dns-5ccc5c4795-qjfvh\" (UID: \"7e2bae28-f861-42b0-8fff-59b6516f85ff\") " pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.800143 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-qjfvh\" (UID: \"7e2bae28-f861-42b0-8fff-59b6516f85ff\") " pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.800257 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-qjfvh\" (UID: \"7e2bae28-f861-42b0-8fff-59b6516f85ff\") " pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.815753 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.823641 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f8b757f88-7hxnh"] Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.902199 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-config\") pod \"dnsmasq-dns-5ccc5c4795-qjfvh\" (UID: \"7e2bae28-f861-42b0-8fff-59b6516f85ff\") " pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.902494 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-dns-swift-storage-0\") pod 
\"dnsmasq-dns-5ccc5c4795-qjfvh\" (UID: \"7e2bae28-f861-42b0-8fff-59b6516f85ff\") " pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.902618 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/adcba488-cf27-4f33-8046-52a1f2c92b9b-config\") pod \"neutron-7f8b757f88-7hxnh\" (UID: \"adcba488-cf27-4f33-8046-52a1f2c92b9b\") " pod="openstack/neutron-7f8b757f88-7hxnh" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.902758 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-qjfvh\" (UID: \"7e2bae28-f861-42b0-8fff-59b6516f85ff\") " pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.902885 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gt9t\" (UniqueName: \"kubernetes.io/projected/adcba488-cf27-4f33-8046-52a1f2c92b9b-kube-api-access-2gt9t\") pod \"neutron-7f8b757f88-7hxnh\" (UID: \"adcba488-cf27-4f33-8046-52a1f2c92b9b\") " pod="openstack/neutron-7f8b757f88-7hxnh" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.902990 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/adcba488-cf27-4f33-8046-52a1f2c92b9b-ovndb-tls-certs\") pod \"neutron-7f8b757f88-7hxnh\" (UID: \"adcba488-cf27-4f33-8046-52a1f2c92b9b\") " pod="openstack/neutron-7f8b757f88-7hxnh" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.903087 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/adcba488-cf27-4f33-8046-52a1f2c92b9b-httpd-config\") pod \"neutron-7f8b757f88-7hxnh\" 
(UID: \"adcba488-cf27-4f33-8046-52a1f2c92b9b\") " pod="openstack/neutron-7f8b757f88-7hxnh" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.903160 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh2rq\" (UniqueName: \"kubernetes.io/projected/7e2bae28-f861-42b0-8fff-59b6516f85ff-kube-api-access-mh2rq\") pod \"dnsmasq-dns-5ccc5c4795-qjfvh\" (UID: \"7e2bae28-f861-42b0-8fff-59b6516f85ff\") " pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.903257 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-qjfvh\" (UID: \"7e2bae28-f861-42b0-8fff-59b6516f85ff\") " pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.903342 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adcba488-cf27-4f33-8046-52a1f2c92b9b-combined-ca-bundle\") pod \"neutron-7f8b757f88-7hxnh\" (UID: \"adcba488-cf27-4f33-8046-52a1f2c92b9b\") " pod="openstack/neutron-7f8b757f88-7hxnh" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.904107 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-qjfvh\" (UID: \"7e2bae28-f861-42b0-8fff-59b6516f85ff\") " pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" Sep 29 10:05:49 crc kubenswrapper[4891]: I0929 10:05:49.993963 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-qjfvh\" (UID: 
\"7e2bae28-f861-42b0-8fff-59b6516f85ff\") " pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.003902 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-qjfvh\" (UID: \"7e2bae28-f861-42b0-8fff-59b6516f85ff\") " pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.003929 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-config\") pod \"dnsmasq-dns-5ccc5c4795-qjfvh\" (UID: \"7e2bae28-f861-42b0-8fff-59b6516f85ff\") " pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.003956 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-qjfvh\" (UID: \"7e2bae28-f861-42b0-8fff-59b6516f85ff\") " pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.005947 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gt9t\" (UniqueName: \"kubernetes.io/projected/adcba488-cf27-4f33-8046-52a1f2c92b9b-kube-api-access-2gt9t\") pod \"neutron-7f8b757f88-7hxnh\" (UID: \"adcba488-cf27-4f33-8046-52a1f2c92b9b\") " pod="openstack/neutron-7f8b757f88-7hxnh" Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.006011 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/adcba488-cf27-4f33-8046-52a1f2c92b9b-ovndb-tls-certs\") pod \"neutron-7f8b757f88-7hxnh\" (UID: \"adcba488-cf27-4f33-8046-52a1f2c92b9b\") " pod="openstack/neutron-7f8b757f88-7hxnh" Sep 29 10:05:50 crc 
kubenswrapper[4891]: I0929 10:05:50.006057 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/adcba488-cf27-4f33-8046-52a1f2c92b9b-httpd-config\") pod \"neutron-7f8b757f88-7hxnh\" (UID: \"adcba488-cf27-4f33-8046-52a1f2c92b9b\") " pod="openstack/neutron-7f8b757f88-7hxnh" Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.006115 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adcba488-cf27-4f33-8046-52a1f2c92b9b-combined-ca-bundle\") pod \"neutron-7f8b757f88-7hxnh\" (UID: \"adcba488-cf27-4f33-8046-52a1f2c92b9b\") " pod="openstack/neutron-7f8b757f88-7hxnh" Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.006176 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/adcba488-cf27-4f33-8046-52a1f2c92b9b-config\") pod \"neutron-7f8b757f88-7hxnh\" (UID: \"adcba488-cf27-4f33-8046-52a1f2c92b9b\") " pod="openstack/neutron-7f8b757f88-7hxnh" Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.007997 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-qjfvh\" (UID: \"7e2bae28-f861-42b0-8fff-59b6516f85ff\") " pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.013683 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/adcba488-cf27-4f33-8046-52a1f2c92b9b-httpd-config\") pod \"neutron-7f8b757f88-7hxnh\" (UID: \"adcba488-cf27-4f33-8046-52a1f2c92b9b\") " pod="openstack/neutron-7f8b757f88-7hxnh" Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.016733 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/adcba488-cf27-4f33-8046-52a1f2c92b9b-ovndb-tls-certs\") pod \"neutron-7f8b757f88-7hxnh\" (UID: \"adcba488-cf27-4f33-8046-52a1f2c92b9b\") " pod="openstack/neutron-7f8b757f88-7hxnh" Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.017551 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/adcba488-cf27-4f33-8046-52a1f2c92b9b-config\") pod \"neutron-7f8b757f88-7hxnh\" (UID: \"adcba488-cf27-4f33-8046-52a1f2c92b9b\") " pod="openstack/neutron-7f8b757f88-7hxnh" Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.020756 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adcba488-cf27-4f33-8046-52a1f2c92b9b-combined-ca-bundle\") pod \"neutron-7f8b757f88-7hxnh\" (UID: \"adcba488-cf27-4f33-8046-52a1f2c92b9b\") " pod="openstack/neutron-7f8b757f88-7hxnh" Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.026515 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gt9t\" (UniqueName: \"kubernetes.io/projected/adcba488-cf27-4f33-8046-52a1f2c92b9b-kube-api-access-2gt9t\") pod \"neutron-7f8b757f88-7hxnh\" (UID: \"adcba488-cf27-4f33-8046-52a1f2c92b9b\") " pod="openstack/neutron-7f8b757f88-7hxnh" Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.082928 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh2rq\" (UniqueName: \"kubernetes.io/projected/7e2bae28-f861-42b0-8fff-59b6516f85ff-kube-api-access-mh2rq\") pod \"dnsmasq-dns-5ccc5c4795-qjfvh\" (UID: \"7e2bae28-f861-42b0-8fff-59b6516f85ff\") " pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.157377 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f8b757f88-7hxnh" Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.327457 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.428446 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5b9ccf6696-q5jmg" event={"ID":"b4050314-008a-4b46-93e7-2d9454fa3d89","Type":"ContainerStarted","Data":"3877983d4f45904229e2701e00a3cca6595d0c0581215ad5b805aefc9f9ff3f4"} Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.429032 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5b9ccf6696-q5jmg" event={"ID":"b4050314-008a-4b46-93e7-2d9454fa3d89","Type":"ContainerStarted","Data":"fb87d60094c289e3f568de45ce696972e42cb6db2c6e8a47229fb8ef3429a7f3"} Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.666625 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.666744 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.707089 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.763655 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.883198 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.883334 4891 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 10:05:50 crc kubenswrapper[4891]: I0929 10:05:50.935158 
4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 29 10:05:51 crc kubenswrapper[4891]: I0929 10:05:51.058003 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-qjfvh"] Sep 29 10:05:51 crc kubenswrapper[4891]: I0929 10:05:51.262478 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f8b757f88-7hxnh"] Sep 29 10:05:51 crc kubenswrapper[4891]: I0929 10:05:51.459705 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" event={"ID":"7e2bae28-f861-42b0-8fff-59b6516f85ff","Type":"ContainerStarted","Data":"f8bcb9f25f9c1e7b13decbef5b11e131ea130fe869b43d81a5de1bcd0a0d01fa"} Sep 29 10:05:51 crc kubenswrapper[4891]: I0929 10:05:51.464599 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f8b757f88-7hxnh" event={"ID":"adcba488-cf27-4f33-8046-52a1f2c92b9b","Type":"ContainerStarted","Data":"679740f3aae0740c12f0a2a8a0cd94b4e432284320733ba79ec2375d25607f35"} Sep 29 10:05:51 crc kubenswrapper[4891]: I0929 10:05:51.466698 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 29 10:05:51 crc kubenswrapper[4891]: I0929 10:05:51.466744 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5b9ccf6696-q5jmg" Sep 29 10:05:51 crc kubenswrapper[4891]: I0929 10:05:51.466760 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 29 10:05:51 crc kubenswrapper[4891]: I0929 10:05:51.511155 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5b9ccf6696-q5jmg" podStartSLOduration=3.511122825 podStartE2EDuration="3.511122825s" podCreationTimestamp="2025-09-29 10:05:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-09-29 10:05:51.491121773 +0000 UTC m=+1081.696290104" watchObservedRunningTime="2025-09-29 10:05:51.511122825 +0000 UTC m=+1081.716291156" Sep 29 10:05:52 crc kubenswrapper[4891]: I0929 10:05:52.504927 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f8b757f88-7hxnh" event={"ID":"adcba488-cf27-4f33-8046-52a1f2c92b9b","Type":"ContainerStarted","Data":"b2f500907188a05e44bfb23e95e794f24e6465960eddc5cf8ab4ec28d770445a"} Sep 29 10:05:52 crc kubenswrapper[4891]: I0929 10:05:52.505981 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f8b757f88-7hxnh" event={"ID":"adcba488-cf27-4f33-8046-52a1f2c92b9b","Type":"ContainerStarted","Data":"b670472af43fb7a0fe22de0471be6eaf4c651568002cb2dac21913a7fed90a6f"} Sep 29 10:05:52 crc kubenswrapper[4891]: I0929 10:05:52.506114 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7f8b757f88-7hxnh" Sep 29 10:05:52 crc kubenswrapper[4891]: I0929 10:05:52.528006 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-s4bk8" event={"ID":"5872fd60-b8c9-4f00-8c9a-679960a32e27","Type":"ContainerStarted","Data":"037c02798e133e92a31e44733436f13ca27a498b8569fab2b48c805aa0594c30"} Sep 29 10:05:52 crc kubenswrapper[4891]: I0929 10:05:52.549083 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jv6jw" event={"ID":"fb88c2dd-0bb3-4425-842f-b697d51f8273","Type":"ContainerStarted","Data":"5ea6368b877c7f0c81501f74137a9cefa3020a0725510ca09dfdf7ff6ab644da"} Sep 29 10:05:52 crc kubenswrapper[4891]: I0929 10:05:52.550261 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7f8b757f88-7hxnh" podStartSLOduration=3.550239569 podStartE2EDuration="3.550239569s" podCreationTimestamp="2025-09-29 10:05:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-29 10:05:52.532746032 +0000 UTC m=+1082.737914353" watchObservedRunningTime="2025-09-29 10:05:52.550239569 +0000 UTC m=+1082.755407900" Sep 29 10:05:52 crc kubenswrapper[4891]: I0929 10:05:52.567231 4891 generic.go:334] "Generic (PLEG): container finished" podID="7e2bae28-f861-42b0-8fff-59b6516f85ff" containerID="37ae262c85e8f74c37d5ca8a66a0cec073822408fdf83d62c18a9ebc0e42fe90" exitCode=0 Sep 29 10:05:52 crc kubenswrapper[4891]: I0929 10:05:52.569897 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" event={"ID":"7e2bae28-f861-42b0-8fff-59b6516f85ff","Type":"ContainerDied","Data":"37ae262c85e8f74c37d5ca8a66a0cec073822408fdf83d62c18a9ebc0e42fe90"} Sep 29 10:05:52 crc kubenswrapper[4891]: I0929 10:05:52.591691 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-s4bk8" podStartSLOduration=3.073355663 podStartE2EDuration="37.591668474s" podCreationTimestamp="2025-09-29 10:05:15 +0000 UTC" firstStartedPulling="2025-09-29 10:05:17.014217887 +0000 UTC m=+1047.219386218" lastFinishedPulling="2025-09-29 10:05:51.532530708 +0000 UTC m=+1081.737699029" observedRunningTime="2025-09-29 10:05:52.56176854 +0000 UTC m=+1082.766936851" watchObservedRunningTime="2025-09-29 10:05:52.591668474 +0000 UTC m=+1082.796836795" Sep 29 10:05:52 crc kubenswrapper[4891]: I0929 10:05:52.651113 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-jv6jw" podStartSLOduration=3.128401681 podStartE2EDuration="37.651083451s" podCreationTimestamp="2025-09-29 10:05:15 +0000 UTC" firstStartedPulling="2025-09-29 10:05:16.644353031 +0000 UTC m=+1046.849521352" lastFinishedPulling="2025-09-29 10:05:51.167034801 +0000 UTC m=+1081.372203122" observedRunningTime="2025-09-29 10:05:52.580907596 +0000 UTC m=+1082.786075917" watchObservedRunningTime="2025-09-29 10:05:52.651083451 +0000 UTC m=+1082.856251782" Sep 29 10:05:53 crc 
kubenswrapper[4891]: I0929 10:05:53.248070 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-56d6cd75c7-6j75x"] Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.256151 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56d6cd75c7-6j75x" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.261081 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.264452 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.293124 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56d6cd75c7-6j75x"] Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.340693 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af72b6bb-1073-4ceb-b593-209e646bba5a-ovndb-tls-certs\") pod \"neutron-56d6cd75c7-6j75x\" (UID: \"af72b6bb-1073-4ceb-b593-209e646bba5a\") " pod="openstack/neutron-56d6cd75c7-6j75x" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.341000 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af72b6bb-1073-4ceb-b593-209e646bba5a-config\") pod \"neutron-56d6cd75c7-6j75x\" (UID: \"af72b6bb-1073-4ceb-b593-209e646bba5a\") " pod="openstack/neutron-56d6cd75c7-6j75x" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.341053 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af72b6bb-1073-4ceb-b593-209e646bba5a-public-tls-certs\") pod \"neutron-56d6cd75c7-6j75x\" (UID: \"af72b6bb-1073-4ceb-b593-209e646bba5a\") " pod="openstack/neutron-56d6cd75c7-6j75x" Sep 
29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.341130 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af72b6bb-1073-4ceb-b593-209e646bba5a-combined-ca-bundle\") pod \"neutron-56d6cd75c7-6j75x\" (UID: \"af72b6bb-1073-4ceb-b593-209e646bba5a\") " pod="openstack/neutron-56d6cd75c7-6j75x" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.341149 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6dq8\" (UniqueName: \"kubernetes.io/projected/af72b6bb-1073-4ceb-b593-209e646bba5a-kube-api-access-f6dq8\") pod \"neutron-56d6cd75c7-6j75x\" (UID: \"af72b6bb-1073-4ceb-b593-209e646bba5a\") " pod="openstack/neutron-56d6cd75c7-6j75x" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.341188 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/af72b6bb-1073-4ceb-b593-209e646bba5a-httpd-config\") pod \"neutron-56d6cd75c7-6j75x\" (UID: \"af72b6bb-1073-4ceb-b593-209e646bba5a\") " pod="openstack/neutron-56d6cd75c7-6j75x" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.341205 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af72b6bb-1073-4ceb-b593-209e646bba5a-internal-tls-certs\") pod \"neutron-56d6cd75c7-6j75x\" (UID: \"af72b6bb-1073-4ceb-b593-209e646bba5a\") " pod="openstack/neutron-56d6cd75c7-6j75x" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.443610 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af72b6bb-1073-4ceb-b593-209e646bba5a-config\") pod \"neutron-56d6cd75c7-6j75x\" (UID: \"af72b6bb-1073-4ceb-b593-209e646bba5a\") " pod="openstack/neutron-56d6cd75c7-6j75x" Sep 29 10:05:53 crc 
kubenswrapper[4891]: I0929 10:05:53.443940 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af72b6bb-1073-4ceb-b593-209e646bba5a-public-tls-certs\") pod \"neutron-56d6cd75c7-6j75x\" (UID: \"af72b6bb-1073-4ceb-b593-209e646bba5a\") " pod="openstack/neutron-56d6cd75c7-6j75x" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.444042 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af72b6bb-1073-4ceb-b593-209e646bba5a-combined-ca-bundle\") pod \"neutron-56d6cd75c7-6j75x\" (UID: \"af72b6bb-1073-4ceb-b593-209e646bba5a\") " pod="openstack/neutron-56d6cd75c7-6j75x" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.444111 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6dq8\" (UniqueName: \"kubernetes.io/projected/af72b6bb-1073-4ceb-b593-209e646bba5a-kube-api-access-f6dq8\") pod \"neutron-56d6cd75c7-6j75x\" (UID: \"af72b6bb-1073-4ceb-b593-209e646bba5a\") " pod="openstack/neutron-56d6cd75c7-6j75x" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.444200 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af72b6bb-1073-4ceb-b593-209e646bba5a-internal-tls-certs\") pod \"neutron-56d6cd75c7-6j75x\" (UID: \"af72b6bb-1073-4ceb-b593-209e646bba5a\") " pod="openstack/neutron-56d6cd75c7-6j75x" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.444272 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/af72b6bb-1073-4ceb-b593-209e646bba5a-httpd-config\") pod \"neutron-56d6cd75c7-6j75x\" (UID: \"af72b6bb-1073-4ceb-b593-209e646bba5a\") " pod="openstack/neutron-56d6cd75c7-6j75x" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.444374 4891 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af72b6bb-1073-4ceb-b593-209e646bba5a-ovndb-tls-certs\") pod \"neutron-56d6cd75c7-6j75x\" (UID: \"af72b6bb-1073-4ceb-b593-209e646bba5a\") " pod="openstack/neutron-56d6cd75c7-6j75x" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.451188 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af72b6bb-1073-4ceb-b593-209e646bba5a-combined-ca-bundle\") pod \"neutron-56d6cd75c7-6j75x\" (UID: \"af72b6bb-1073-4ceb-b593-209e646bba5a\") " pod="openstack/neutron-56d6cd75c7-6j75x" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.461030 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/af72b6bb-1073-4ceb-b593-209e646bba5a-config\") pod \"neutron-56d6cd75c7-6j75x\" (UID: \"af72b6bb-1073-4ceb-b593-209e646bba5a\") " pod="openstack/neutron-56d6cd75c7-6j75x" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.474812 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af72b6bb-1073-4ceb-b593-209e646bba5a-ovndb-tls-certs\") pod \"neutron-56d6cd75c7-6j75x\" (UID: \"af72b6bb-1073-4ceb-b593-209e646bba5a\") " pod="openstack/neutron-56d6cd75c7-6j75x" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.475012 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af72b6bb-1073-4ceb-b593-209e646bba5a-internal-tls-certs\") pod \"neutron-56d6cd75c7-6j75x\" (UID: \"af72b6bb-1073-4ceb-b593-209e646bba5a\") " pod="openstack/neutron-56d6cd75c7-6j75x" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.475026 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/af72b6bb-1073-4ceb-b593-209e646bba5a-public-tls-certs\") pod \"neutron-56d6cd75c7-6j75x\" (UID: \"af72b6bb-1073-4ceb-b593-209e646bba5a\") " pod="openstack/neutron-56d6cd75c7-6j75x" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.475745 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6dq8\" (UniqueName: \"kubernetes.io/projected/af72b6bb-1073-4ceb-b593-209e646bba5a-kube-api-access-f6dq8\") pod \"neutron-56d6cd75c7-6j75x\" (UID: \"af72b6bb-1073-4ceb-b593-209e646bba5a\") " pod="openstack/neutron-56d6cd75c7-6j75x" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.476359 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/af72b6bb-1073-4ceb-b593-209e646bba5a-httpd-config\") pod \"neutron-56d6cd75c7-6j75x\" (UID: \"af72b6bb-1073-4ceb-b593-209e646bba5a\") " pod="openstack/neutron-56d6cd75c7-6j75x" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.629938 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" event={"ID":"7e2bae28-f861-42b0-8fff-59b6516f85ff","Type":"ContainerStarted","Data":"ad342bf4983501bf4838c89a78fd3408436b7bbd7c8e4311706a8ad829a18622"} Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.630344 4891 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.631085 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.633369 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56d6cd75c7-6j75x" Sep 29 10:05:53 crc kubenswrapper[4891]: I0929 10:05:53.662248 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" podStartSLOduration=4.662220468 podStartE2EDuration="4.662220468s" podCreationTimestamp="2025-09-29 10:05:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:05:53.654554841 +0000 UTC m=+1083.859723162" watchObservedRunningTime="2025-09-29 10:05:53.662220468 +0000 UTC m=+1083.867388789" Sep 29 10:05:54 crc kubenswrapper[4891]: I0929 10:05:54.342228 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56d6cd75c7-6j75x"] Sep 29 10:05:54 crc kubenswrapper[4891]: I0929 10:05:54.583159 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7975d54bd8-pl4st" podUID="4cfbacc9-ec34-4515-9874-1fd082cdbea3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Sep 29 10:05:54 crc kubenswrapper[4891]: I0929 10:05:54.648954 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56d6cd75c7-6j75x" event={"ID":"af72b6bb-1073-4ceb-b593-209e646bba5a","Type":"ContainerStarted","Data":"cdbe2985135bb485ddaf242f06ab46e6e09046a4b1a17d8ce90eb2117e646de4"} Sep 29 10:05:54 crc kubenswrapper[4891]: I0929 10:05:54.682822 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 29 10:05:54 crc kubenswrapper[4891]: I0929 10:05:54.682989 4891 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 10:05:54 crc kubenswrapper[4891]: I0929 10:05:54.686925 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Sep 29 10:05:54 crc kubenswrapper[4891]: I0929 10:05:54.690746 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6d8cd8ff44-d8rc8" podUID="d464aff7-6448-4eaf-b88e-01a8acc3e42a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Sep 29 10:05:55 crc kubenswrapper[4891]: I0929 10:05:55.661405 4891 generic.go:334] "Generic (PLEG): container finished" podID="fb88c2dd-0bb3-4425-842f-b697d51f8273" containerID="5ea6368b877c7f0c81501f74137a9cefa3020a0725510ca09dfdf7ff6ab644da" exitCode=0 Sep 29 10:05:55 crc kubenswrapper[4891]: I0929 10:05:55.661475 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jv6jw" event={"ID":"fb88c2dd-0bb3-4425-842f-b697d51f8273","Type":"ContainerDied","Data":"5ea6368b877c7f0c81501f74137a9cefa3020a0725510ca09dfdf7ff6ab644da"} Sep 29 10:05:56 crc kubenswrapper[4891]: I0929 10:05:56.675971 4891 generic.go:334] "Generic (PLEG): container finished" podID="5872fd60-b8c9-4f00-8c9a-679960a32e27" containerID="037c02798e133e92a31e44733436f13ca27a498b8569fab2b48c805aa0594c30" exitCode=0 Sep 29 10:05:56 crc kubenswrapper[4891]: I0929 10:05:56.676047 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-s4bk8" event={"ID":"5872fd60-b8c9-4f00-8c9a-679960a32e27","Type":"ContainerDied","Data":"037c02798e133e92a31e44733436f13ca27a498b8569fab2b48c805aa0594c30"} Sep 29 10:06:00 crc kubenswrapper[4891]: I0929 10:06:00.331131 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" Sep 29 10:06:00 crc kubenswrapper[4891]: I0929 10:06:00.393631 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-tkdcz"] Sep 29 10:06:00 crc kubenswrapper[4891]: I0929 10:06:00.393966 4891 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" podUID="048fe6e5-9e83-456c-965f-d4b0a7378b02" containerName="dnsmasq-dns" containerID="cri-o://1d3f9e19f41504fccd8bb715da857cc470817c51340a28de447051e0e4a720f7" gracePeriod=10 Sep 29 10:06:00 crc kubenswrapper[4891]: I0929 10:06:00.743003 4891 generic.go:334] "Generic (PLEG): container finished" podID="048fe6e5-9e83-456c-965f-d4b0a7378b02" containerID="1d3f9e19f41504fccd8bb715da857cc470817c51340a28de447051e0e4a720f7" exitCode=0 Sep 29 10:06:00 crc kubenswrapper[4891]: I0929 10:06:00.743405 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" event={"ID":"048fe6e5-9e83-456c-965f-d4b0a7378b02","Type":"ContainerDied","Data":"1d3f9e19f41504fccd8bb715da857cc470817c51340a28de447051e0e4a720f7"} Sep 29 10:06:01 crc kubenswrapper[4891]: I0929 10:06:01.107119 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" podUID="048fe6e5-9e83-456c-965f-d4b0a7378b02" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.141:5353: connect: connection refused" Sep 29 10:06:06 crc kubenswrapper[4891]: E0929 10:06:06.059501 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Sep 29 10:06:06 crc kubenswrapper[4891]: E0929 10:06:06.060104 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-456p7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-hgm8s_openstack(f7dd6438-e338-4dce-b2be-0e36b359631c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:06:06 crc kubenswrapper[4891]: E0929 10:06:06.061208 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-hgm8s" podUID="f7dd6438-e338-4dce-b2be-0e36b359631c" Sep 29 10:06:06 crc kubenswrapper[4891]: I0929 10:06:06.098855 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-s4bk8" Sep 29 10:06:06 crc kubenswrapper[4891]: I0929 10:06:06.107274 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" podUID="048fe6e5-9e83-456c-965f-d4b0a7378b02" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.141:5353: connect: connection refused" Sep 29 10:06:06 crc kubenswrapper[4891]: I0929 10:06:06.194639 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5872fd60-b8c9-4f00-8c9a-679960a32e27-db-sync-config-data\") pod \"5872fd60-b8c9-4f00-8c9a-679960a32e27\" (UID: \"5872fd60-b8c9-4f00-8c9a-679960a32e27\") " Sep 29 10:06:06 crc kubenswrapper[4891]: I0929 10:06:06.194765 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4sj8\" (UniqueName: \"kubernetes.io/projected/5872fd60-b8c9-4f00-8c9a-679960a32e27-kube-api-access-q4sj8\") pod \"5872fd60-b8c9-4f00-8c9a-679960a32e27\" (UID: \"5872fd60-b8c9-4f00-8c9a-679960a32e27\") " Sep 29 10:06:06 
crc kubenswrapper[4891]: I0929 10:06:06.194947 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5872fd60-b8c9-4f00-8c9a-679960a32e27-combined-ca-bundle\") pod \"5872fd60-b8c9-4f00-8c9a-679960a32e27\" (UID: \"5872fd60-b8c9-4f00-8c9a-679960a32e27\") " Sep 29 10:06:06 crc kubenswrapper[4891]: I0929 10:06:06.213922 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5872fd60-b8c9-4f00-8c9a-679960a32e27-kube-api-access-q4sj8" (OuterVolumeSpecName: "kube-api-access-q4sj8") pod "5872fd60-b8c9-4f00-8c9a-679960a32e27" (UID: "5872fd60-b8c9-4f00-8c9a-679960a32e27"). InnerVolumeSpecName "kube-api-access-q4sj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:06:06 crc kubenswrapper[4891]: I0929 10:06:06.214748 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5872fd60-b8c9-4f00-8c9a-679960a32e27-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5872fd60-b8c9-4f00-8c9a-679960a32e27" (UID: "5872fd60-b8c9-4f00-8c9a-679960a32e27"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:06 crc kubenswrapper[4891]: I0929 10:06:06.229516 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5872fd60-b8c9-4f00-8c9a-679960a32e27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5872fd60-b8c9-4f00-8c9a-679960a32e27" (UID: "5872fd60-b8c9-4f00-8c9a-679960a32e27"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:06 crc kubenswrapper[4891]: I0929 10:06:06.302300 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5872fd60-b8c9-4f00-8c9a-679960a32e27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:06 crc kubenswrapper[4891]: I0929 10:06:06.302658 4891 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5872fd60-b8c9-4f00-8c9a-679960a32e27-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:06 crc kubenswrapper[4891]: I0929 10:06:06.302757 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4sj8\" (UniqueName: \"kubernetes.io/projected/5872fd60-b8c9-4f00-8c9a-679960a32e27-kube-api-access-q4sj8\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:06 crc kubenswrapper[4891]: I0929 10:06:06.757604 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:06:06 crc kubenswrapper[4891]: I0929 10:06:06.797101 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:06:06 crc kubenswrapper[4891]: I0929 10:06:06.805754 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-s4bk8" event={"ID":"5872fd60-b8c9-4f00-8c9a-679960a32e27","Type":"ContainerDied","Data":"908cbaacda7df76988b468af030768ef7108fc0bf1a730d129d15b1857516674"} Sep 29 10:06:06 crc kubenswrapper[4891]: I0929 10:06:06.805811 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="908cbaacda7df76988b468af030768ef7108fc0bf1a730d129d15b1857516674" Sep 29 10:06:06 crc kubenswrapper[4891]: I0929 10:06:06.805885 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-s4bk8" Sep 29 10:06:06 crc kubenswrapper[4891]: E0929 10:06:06.810138 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-hgm8s" podUID="f7dd6438-e338-4dce-b2be-0e36b359631c" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.400936 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-677dd7cdbc-drpcv"] Sep 29 10:06:07 crc kubenswrapper[4891]: E0929 10:06:07.401365 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5872fd60-b8c9-4f00-8c9a-679960a32e27" containerName="barbican-db-sync" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.401378 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="5872fd60-b8c9-4f00-8c9a-679960a32e27" containerName="barbican-db-sync" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.401572 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="5872fd60-b8c9-4f00-8c9a-679960a32e27" containerName="barbican-db-sync" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.402567 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-677dd7cdbc-drpcv" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.406501 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.406721 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.406850 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qmv66" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.410927 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-748f5656b6-pdpff"] Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.413193 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-748f5656b6-pdpff" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.421538 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.438390 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-677dd7cdbc-drpcv"] Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.442774 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffhp4\" (UniqueName: \"kubernetes.io/projected/d8d04caa-6db6-41c2-bf9b-f5ed373e9799-kube-api-access-ffhp4\") pod \"barbican-keystone-listener-748f5656b6-pdpff\" (UID: \"d8d04caa-6db6-41c2-bf9b-f5ed373e9799\") " pod="openstack/barbican-keystone-listener-748f5656b6-pdpff" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.442871 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99-config-data\") pod \"barbican-worker-677dd7cdbc-drpcv\" (UID: \"e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99\") " pod="openstack/barbican-worker-677dd7cdbc-drpcv" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.442897 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8d04caa-6db6-41c2-bf9b-f5ed373e9799-config-data\") pod \"barbican-keystone-listener-748f5656b6-pdpff\" (UID: \"d8d04caa-6db6-41c2-bf9b-f5ed373e9799\") " pod="openstack/barbican-keystone-listener-748f5656b6-pdpff" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.445516 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8d04caa-6db6-41c2-bf9b-f5ed373e9799-config-data-custom\") pod \"barbican-keystone-listener-748f5656b6-pdpff\" (UID: \"d8d04caa-6db6-41c2-bf9b-f5ed373e9799\") " pod="openstack/barbican-keystone-listener-748f5656b6-pdpff" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.445630 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-748f5656b6-pdpff"] Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.445891 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pmzd\" (UniqueName: \"kubernetes.io/projected/e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99-kube-api-access-2pmzd\") pod \"barbican-worker-677dd7cdbc-drpcv\" (UID: \"e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99\") " pod="openstack/barbican-worker-677dd7cdbc-drpcv" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.446045 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99-logs\") pod \"barbican-worker-677dd7cdbc-drpcv\" (UID: 
\"e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99\") " pod="openstack/barbican-worker-677dd7cdbc-drpcv" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.446211 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99-config-data-custom\") pod \"barbican-worker-677dd7cdbc-drpcv\" (UID: \"e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99\") " pod="openstack/barbican-worker-677dd7cdbc-drpcv" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.446314 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99-combined-ca-bundle\") pod \"barbican-worker-677dd7cdbc-drpcv\" (UID: \"e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99\") " pod="openstack/barbican-worker-677dd7cdbc-drpcv" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.446436 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8d04caa-6db6-41c2-bf9b-f5ed373e9799-logs\") pod \"barbican-keystone-listener-748f5656b6-pdpff\" (UID: \"d8d04caa-6db6-41c2-bf9b-f5ed373e9799\") " pod="openstack/barbican-keystone-listener-748f5656b6-pdpff" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.446518 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d04caa-6db6-41c2-bf9b-f5ed373e9799-combined-ca-bundle\") pod \"barbican-keystone-listener-748f5656b6-pdpff\" (UID: \"d8d04caa-6db6-41c2-bf9b-f5ed373e9799\") " pod="openstack/barbican-keystone-listener-748f5656b6-pdpff" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.524780 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-hwd4m"] Sep 29 10:06:07 crc kubenswrapper[4891]: 
I0929 10:06:07.526402 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.542715 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-hwd4m"] Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.556947 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pmzd\" (UniqueName: \"kubernetes.io/projected/e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99-kube-api-access-2pmzd\") pod \"barbican-worker-677dd7cdbc-drpcv\" (UID: \"e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99\") " pod="openstack/barbican-worker-677dd7cdbc-drpcv" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.557013 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99-logs\") pod \"barbican-worker-677dd7cdbc-drpcv\" (UID: \"e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99\") " pod="openstack/barbican-worker-677dd7cdbc-drpcv" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.557088 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99-config-data-custom\") pod \"barbican-worker-677dd7cdbc-drpcv\" (UID: \"e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99\") " pod="openstack/barbican-worker-677dd7cdbc-drpcv" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.557168 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99-combined-ca-bundle\") pod \"barbican-worker-677dd7cdbc-drpcv\" (UID: \"e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99\") " pod="openstack/barbican-worker-677dd7cdbc-drpcv" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.557227 4891 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8d04caa-6db6-41c2-bf9b-f5ed373e9799-logs\") pod \"barbican-keystone-listener-748f5656b6-pdpff\" (UID: \"d8d04caa-6db6-41c2-bf9b-f5ed373e9799\") " pod="openstack/barbican-keystone-listener-748f5656b6-pdpff" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.557774 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99-logs\") pod \"barbican-worker-677dd7cdbc-drpcv\" (UID: \"e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99\") " pod="openstack/barbican-worker-677dd7cdbc-drpcv" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.557247 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d04caa-6db6-41c2-bf9b-f5ed373e9799-combined-ca-bundle\") pod \"barbican-keystone-listener-748f5656b6-pdpff\" (UID: \"d8d04caa-6db6-41c2-bf9b-f5ed373e9799\") " pod="openstack/barbican-keystone-listener-748f5656b6-pdpff" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.558583 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffhp4\" (UniqueName: \"kubernetes.io/projected/d8d04caa-6db6-41c2-bf9b-f5ed373e9799-kube-api-access-ffhp4\") pod \"barbican-keystone-listener-748f5656b6-pdpff\" (UID: \"d8d04caa-6db6-41c2-bf9b-f5ed373e9799\") " pod="openstack/barbican-keystone-listener-748f5656b6-pdpff" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.558718 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99-config-data\") pod \"barbican-worker-677dd7cdbc-drpcv\" (UID: \"e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99\") " pod="openstack/barbican-worker-677dd7cdbc-drpcv" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.558808 4891 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8d04caa-6db6-41c2-bf9b-f5ed373e9799-config-data\") pod \"barbican-keystone-listener-748f5656b6-pdpff\" (UID: \"d8d04caa-6db6-41c2-bf9b-f5ed373e9799\") " pod="openstack/barbican-keystone-listener-748f5656b6-pdpff" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.558885 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8d04caa-6db6-41c2-bf9b-f5ed373e9799-config-data-custom\") pod \"barbican-keystone-listener-748f5656b6-pdpff\" (UID: \"d8d04caa-6db6-41c2-bf9b-f5ed373e9799\") " pod="openstack/barbican-keystone-listener-748f5656b6-pdpff" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.559630 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8d04caa-6db6-41c2-bf9b-f5ed373e9799-logs\") pod \"barbican-keystone-listener-748f5656b6-pdpff\" (UID: \"d8d04caa-6db6-41c2-bf9b-f5ed373e9799\") " pod="openstack/barbican-keystone-listener-748f5656b6-pdpff" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.573486 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99-combined-ca-bundle\") pod \"barbican-worker-677dd7cdbc-drpcv\" (UID: \"e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99\") " pod="openstack/barbican-worker-677dd7cdbc-drpcv" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.573606 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d04caa-6db6-41c2-bf9b-f5ed373e9799-combined-ca-bundle\") pod \"barbican-keystone-listener-748f5656b6-pdpff\" (UID: \"d8d04caa-6db6-41c2-bf9b-f5ed373e9799\") " pod="openstack/barbican-keystone-listener-748f5656b6-pdpff" Sep 29 10:06:07 crc 
kubenswrapper[4891]: I0929 10:06:07.583186 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8d04caa-6db6-41c2-bf9b-f5ed373e9799-config-data-custom\") pod \"barbican-keystone-listener-748f5656b6-pdpff\" (UID: \"d8d04caa-6db6-41c2-bf9b-f5ed373e9799\") " pod="openstack/barbican-keystone-listener-748f5656b6-pdpff" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.585130 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8d04caa-6db6-41c2-bf9b-f5ed373e9799-config-data\") pod \"barbican-keystone-listener-748f5656b6-pdpff\" (UID: \"d8d04caa-6db6-41c2-bf9b-f5ed373e9799\") " pod="openstack/barbican-keystone-listener-748f5656b6-pdpff" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.588204 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99-config-data-custom\") pod \"barbican-worker-677dd7cdbc-drpcv\" (UID: \"e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99\") " pod="openstack/barbican-worker-677dd7cdbc-drpcv" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.589820 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99-config-data\") pod \"barbican-worker-677dd7cdbc-drpcv\" (UID: \"e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99\") " pod="openstack/barbican-worker-677dd7cdbc-drpcv" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.604431 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pmzd\" (UniqueName: \"kubernetes.io/projected/e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99-kube-api-access-2pmzd\") pod \"barbican-worker-677dd7cdbc-drpcv\" (UID: \"e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99\") " pod="openstack/barbican-worker-677dd7cdbc-drpcv" Sep 29 10:06:07 crc 
kubenswrapper[4891]: I0929 10:06:07.609997 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffhp4\" (UniqueName: \"kubernetes.io/projected/d8d04caa-6db6-41c2-bf9b-f5ed373e9799-kube-api-access-ffhp4\") pod \"barbican-keystone-listener-748f5656b6-pdpff\" (UID: \"d8d04caa-6db6-41c2-bf9b-f5ed373e9799\") " pod="openstack/barbican-keystone-listener-748f5656b6-pdpff" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.619996 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5bbdbf46b6-mv6rb"] Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.622477 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5bbdbf46b6-mv6rb" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.626110 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.631530 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5bbdbf46b6-mv6rb"] Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.661272 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfzk6\" (UniqueName: \"kubernetes.io/projected/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-kube-api-access-vfzk6\") pod \"barbican-api-5bbdbf46b6-mv6rb\" (UID: \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\") " pod="openstack/barbican-api-5bbdbf46b6-mv6rb" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.661323 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-dns-svc\") pod \"dnsmasq-dns-688c87cc99-hwd4m\" (UID: \"eeaee9da-afe7-4854-95a3-ac91aeac850e\") " pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.661349 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-hwd4m\" (UID: \"eeaee9da-afe7-4854-95a3-ac91aeac850e\") " pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.661392 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-config-data\") pod \"barbican-api-5bbdbf46b6-mv6rb\" (UID: \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\") " pod="openstack/barbican-api-5bbdbf46b6-mv6rb" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.661428 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-logs\") pod \"barbican-api-5bbdbf46b6-mv6rb\" (UID: \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\") " pod="openstack/barbican-api-5bbdbf46b6-mv6rb" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.661461 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt77m\" (UniqueName: \"kubernetes.io/projected/eeaee9da-afe7-4854-95a3-ac91aeac850e-kube-api-access-tt77m\") pod \"dnsmasq-dns-688c87cc99-hwd4m\" (UID: \"eeaee9da-afe7-4854-95a3-ac91aeac850e\") " pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.661482 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-config-data-custom\") pod \"barbican-api-5bbdbf46b6-mv6rb\" (UID: \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\") " pod="openstack/barbican-api-5bbdbf46b6-mv6rb" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 
10:06:07.661498 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-hwd4m\" (UID: \"eeaee9da-afe7-4854-95a3-ac91aeac850e\") " pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.661527 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-config\") pod \"dnsmasq-dns-688c87cc99-hwd4m\" (UID: \"eeaee9da-afe7-4854-95a3-ac91aeac850e\") " pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.661549 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-hwd4m\" (UID: \"eeaee9da-afe7-4854-95a3-ac91aeac850e\") " pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.661605 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-combined-ca-bundle\") pod \"barbican-api-5bbdbf46b6-mv6rb\" (UID: \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\") " pod="openstack/barbican-api-5bbdbf46b6-mv6rb" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.744028 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-677dd7cdbc-drpcv" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.753136 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-748f5656b6-pdpff" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.763670 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfzk6\" (UniqueName: \"kubernetes.io/projected/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-kube-api-access-vfzk6\") pod \"barbican-api-5bbdbf46b6-mv6rb\" (UID: \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\") " pod="openstack/barbican-api-5bbdbf46b6-mv6rb" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.763742 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-dns-svc\") pod \"dnsmasq-dns-688c87cc99-hwd4m\" (UID: \"eeaee9da-afe7-4854-95a3-ac91aeac850e\") " pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.763776 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-hwd4m\" (UID: \"eeaee9da-afe7-4854-95a3-ac91aeac850e\") " pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.763839 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-config-data\") pod \"barbican-api-5bbdbf46b6-mv6rb\" (UID: \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\") " pod="openstack/barbican-api-5bbdbf46b6-mv6rb" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.763887 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-logs\") pod \"barbican-api-5bbdbf46b6-mv6rb\" (UID: \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\") " 
pod="openstack/barbican-api-5bbdbf46b6-mv6rb" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.763931 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt77m\" (UniqueName: \"kubernetes.io/projected/eeaee9da-afe7-4854-95a3-ac91aeac850e-kube-api-access-tt77m\") pod \"dnsmasq-dns-688c87cc99-hwd4m\" (UID: \"eeaee9da-afe7-4854-95a3-ac91aeac850e\") " pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.763956 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-config-data-custom\") pod \"barbican-api-5bbdbf46b6-mv6rb\" (UID: \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\") " pod="openstack/barbican-api-5bbdbf46b6-mv6rb" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.763978 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-hwd4m\" (UID: \"eeaee9da-afe7-4854-95a3-ac91aeac850e\") " pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.764015 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-config\") pod \"dnsmasq-dns-688c87cc99-hwd4m\" (UID: \"eeaee9da-afe7-4854-95a3-ac91aeac850e\") " pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.764054 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-hwd4m\" (UID: \"eeaee9da-afe7-4854-95a3-ac91aeac850e\") " 
pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.764126 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-combined-ca-bundle\") pod \"barbican-api-5bbdbf46b6-mv6rb\" (UID: \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\") " pod="openstack/barbican-api-5bbdbf46b6-mv6rb" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.764573 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-logs\") pod \"barbican-api-5bbdbf46b6-mv6rb\" (UID: \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\") " pod="openstack/barbican-api-5bbdbf46b6-mv6rb" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.765361 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-hwd4m\" (UID: \"eeaee9da-afe7-4854-95a3-ac91aeac850e\") " pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.765506 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-hwd4m\" (UID: \"eeaee9da-afe7-4854-95a3-ac91aeac850e\") " pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.765657 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-config\") pod \"dnsmasq-dns-688c87cc99-hwd4m\" (UID: \"eeaee9da-afe7-4854-95a3-ac91aeac850e\") " pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 
10:06:07.766614 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-hwd4m\" (UID: \"eeaee9da-afe7-4854-95a3-ac91aeac850e\") " pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.769035 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-config-data-custom\") pod \"barbican-api-5bbdbf46b6-mv6rb\" (UID: \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\") " pod="openstack/barbican-api-5bbdbf46b6-mv6rb" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.770161 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-config-data\") pod \"barbican-api-5bbdbf46b6-mv6rb\" (UID: \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\") " pod="openstack/barbican-api-5bbdbf46b6-mv6rb" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.774185 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-dns-svc\") pod \"dnsmasq-dns-688c87cc99-hwd4m\" (UID: \"eeaee9da-afe7-4854-95a3-ac91aeac850e\") " pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.794369 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfzk6\" (UniqueName: \"kubernetes.io/projected/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-kube-api-access-vfzk6\") pod \"barbican-api-5bbdbf46b6-mv6rb\" (UID: \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\") " pod="openstack/barbican-api-5bbdbf46b6-mv6rb" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.797313 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tt77m\" (UniqueName: \"kubernetes.io/projected/eeaee9da-afe7-4854-95a3-ac91aeac850e-kube-api-access-tt77m\") pod \"dnsmasq-dns-688c87cc99-hwd4m\" (UID: \"eeaee9da-afe7-4854-95a3-ac91aeac850e\") " pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.798530 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-combined-ca-bundle\") pod \"barbican-api-5bbdbf46b6-mv6rb\" (UID: \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\") " pod="openstack/barbican-api-5bbdbf46b6-mv6rb" Sep 29 10:06:07 crc kubenswrapper[4891]: I0929 10:06:07.859320 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" Sep 29 10:06:08 crc kubenswrapper[4891]: I0929 10:06:08.024725 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5bbdbf46b6-mv6rb" Sep 29 10:06:08 crc kubenswrapper[4891]: I0929 10:06:08.818640 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6d8cd8ff44-d8rc8" Sep 29 10:06:08 crc kubenswrapper[4891]: I0929 10:06:08.907197 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7975d54bd8-pl4st"] Sep 29 10:06:08 crc kubenswrapper[4891]: I0929 10:06:08.907781 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7975d54bd8-pl4st" podUID="4cfbacc9-ec34-4515-9874-1fd082cdbea3" containerName="horizon-log" containerID="cri-o://a39e6f0a15611e3cd02b49db7e4c7f404f1b3c67178b2785efd8953d9a079bd6" gracePeriod=30 Sep 29 10:06:08 crc kubenswrapper[4891]: I0929 10:06:08.908016 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7975d54bd8-pl4st" podUID="4cfbacc9-ec34-4515-9874-1fd082cdbea3" containerName="horizon" 
containerID="cri-o://40b5245eac00287a95c8b00f8c6d769adb11cdf2a57f7302ef53817064744865" gracePeriod=30 Sep 29 10:06:08 crc kubenswrapper[4891]: I0929 10:06:08.931604 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7975d54bd8-pl4st" podUID="4cfbacc9-ec34-4515-9874-1fd082cdbea3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Sep 29 10:06:08 crc kubenswrapper[4891]: I0929 10:06:08.938232 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7975d54bd8-pl4st" podUID="4cfbacc9-ec34-4515-9874-1fd082cdbea3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:55752->10.217.0.148:8443: read: connection reset by peer" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.197220 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7ff9945478-9v77b"] Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.198931 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7ff9945478-9v77b" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.201431 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.201736 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.215728 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7ff9945478-9v77b"] Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.324966 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7efda6e6-9019-4909-96be-068496b2577f-combined-ca-bundle\") pod \"barbican-api-7ff9945478-9v77b\" (UID: \"7efda6e6-9019-4909-96be-068496b2577f\") " pod="openstack/barbican-api-7ff9945478-9v77b" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.325031 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qsmj\" (UniqueName: \"kubernetes.io/projected/7efda6e6-9019-4909-96be-068496b2577f-kube-api-access-7qsmj\") pod \"barbican-api-7ff9945478-9v77b\" (UID: \"7efda6e6-9019-4909-96be-068496b2577f\") " pod="openstack/barbican-api-7ff9945478-9v77b" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.325133 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7efda6e6-9019-4909-96be-068496b2577f-config-data\") pod \"barbican-api-7ff9945478-9v77b\" (UID: \"7efda6e6-9019-4909-96be-068496b2577f\") " pod="openstack/barbican-api-7ff9945478-9v77b" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.325173 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/7efda6e6-9019-4909-96be-068496b2577f-public-tls-certs\") pod \"barbican-api-7ff9945478-9v77b\" (UID: \"7efda6e6-9019-4909-96be-068496b2577f\") " pod="openstack/barbican-api-7ff9945478-9v77b" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.325204 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7efda6e6-9019-4909-96be-068496b2577f-internal-tls-certs\") pod \"barbican-api-7ff9945478-9v77b\" (UID: \"7efda6e6-9019-4909-96be-068496b2577f\") " pod="openstack/barbican-api-7ff9945478-9v77b" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.325218 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7efda6e6-9019-4909-96be-068496b2577f-config-data-custom\") pod \"barbican-api-7ff9945478-9v77b\" (UID: \"7efda6e6-9019-4909-96be-068496b2577f\") " pod="openstack/barbican-api-7ff9945478-9v77b" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.325238 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7efda6e6-9019-4909-96be-068496b2577f-logs\") pod \"barbican-api-7ff9945478-9v77b\" (UID: \"7efda6e6-9019-4909-96be-068496b2577f\") " pod="openstack/barbican-api-7ff9945478-9v77b" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.427038 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7efda6e6-9019-4909-96be-068496b2577f-config-data-custom\") pod \"barbican-api-7ff9945478-9v77b\" (UID: \"7efda6e6-9019-4909-96be-068496b2577f\") " pod="openstack/barbican-api-7ff9945478-9v77b" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.427094 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/7efda6e6-9019-4909-96be-068496b2577f-logs\") pod \"barbican-api-7ff9945478-9v77b\" (UID: \"7efda6e6-9019-4909-96be-068496b2577f\") " pod="openstack/barbican-api-7ff9945478-9v77b" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.427810 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7efda6e6-9019-4909-96be-068496b2577f-logs\") pod \"barbican-api-7ff9945478-9v77b\" (UID: \"7efda6e6-9019-4909-96be-068496b2577f\") " pod="openstack/barbican-api-7ff9945478-9v77b" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.427990 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7efda6e6-9019-4909-96be-068496b2577f-combined-ca-bundle\") pod \"barbican-api-7ff9945478-9v77b\" (UID: \"7efda6e6-9019-4909-96be-068496b2577f\") " pod="openstack/barbican-api-7ff9945478-9v77b" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.428025 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qsmj\" (UniqueName: \"kubernetes.io/projected/7efda6e6-9019-4909-96be-068496b2577f-kube-api-access-7qsmj\") pod \"barbican-api-7ff9945478-9v77b\" (UID: \"7efda6e6-9019-4909-96be-068496b2577f\") " pod="openstack/barbican-api-7ff9945478-9v77b" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.428594 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7efda6e6-9019-4909-96be-068496b2577f-config-data\") pod \"barbican-api-7ff9945478-9v77b\" (UID: \"7efda6e6-9019-4909-96be-068496b2577f\") " pod="openstack/barbican-api-7ff9945478-9v77b" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.428668 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7efda6e6-9019-4909-96be-068496b2577f-public-tls-certs\") pod \"barbican-api-7ff9945478-9v77b\" (UID: \"7efda6e6-9019-4909-96be-068496b2577f\") " pod="openstack/barbican-api-7ff9945478-9v77b" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.428726 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7efda6e6-9019-4909-96be-068496b2577f-internal-tls-certs\") pod \"barbican-api-7ff9945478-9v77b\" (UID: \"7efda6e6-9019-4909-96be-068496b2577f\") " pod="openstack/barbican-api-7ff9945478-9v77b" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.433710 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7efda6e6-9019-4909-96be-068496b2577f-config-data-custom\") pod \"barbican-api-7ff9945478-9v77b\" (UID: \"7efda6e6-9019-4909-96be-068496b2577f\") " pod="openstack/barbican-api-7ff9945478-9v77b" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.433808 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7efda6e6-9019-4909-96be-068496b2577f-config-data\") pod \"barbican-api-7ff9945478-9v77b\" (UID: \"7efda6e6-9019-4909-96be-068496b2577f\") " pod="openstack/barbican-api-7ff9945478-9v77b" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.434535 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7efda6e6-9019-4909-96be-068496b2577f-public-tls-certs\") pod \"barbican-api-7ff9945478-9v77b\" (UID: \"7efda6e6-9019-4909-96be-068496b2577f\") " pod="openstack/barbican-api-7ff9945478-9v77b" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.434650 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7efda6e6-9019-4909-96be-068496b2577f-combined-ca-bundle\") pod \"barbican-api-7ff9945478-9v77b\" (UID: \"7efda6e6-9019-4909-96be-068496b2577f\") " pod="openstack/barbican-api-7ff9945478-9v77b" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.438153 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7efda6e6-9019-4909-96be-068496b2577f-internal-tls-certs\") pod \"barbican-api-7ff9945478-9v77b\" (UID: \"7efda6e6-9019-4909-96be-068496b2577f\") " pod="openstack/barbican-api-7ff9945478-9v77b" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.468600 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qsmj\" (UniqueName: \"kubernetes.io/projected/7efda6e6-9019-4909-96be-068496b2577f-kube-api-access-7qsmj\") pod \"barbican-api-7ff9945478-9v77b\" (UID: \"7efda6e6-9019-4909-96be-068496b2577f\") " pod="openstack/barbican-api-7ff9945478-9v77b" Sep 29 10:06:10 crc kubenswrapper[4891]: I0929 10:06:10.530089 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7ff9945478-9v77b" Sep 29 10:06:11 crc kubenswrapper[4891]: I0929 10:06:11.110057 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" podUID="048fe6e5-9e83-456c-965f-d4b0a7378b02" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.141:5353: connect: connection refused" Sep 29 10:06:11 crc kubenswrapper[4891]: I0929 10:06:11.110167 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" Sep 29 10:06:11 crc kubenswrapper[4891]: I0929 10:06:11.160116 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jv6jw" Sep 29 10:06:11 crc kubenswrapper[4891]: I0929 10:06:11.276929 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb88c2dd-0bb3-4425-842f-b697d51f8273-scripts\") pod \"fb88c2dd-0bb3-4425-842f-b697d51f8273\" (UID: \"fb88c2dd-0bb3-4425-842f-b697d51f8273\") " Sep 29 10:06:11 crc kubenswrapper[4891]: I0929 10:06:11.277262 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfr6b\" (UniqueName: \"kubernetes.io/projected/fb88c2dd-0bb3-4425-842f-b697d51f8273-kube-api-access-cfr6b\") pod \"fb88c2dd-0bb3-4425-842f-b697d51f8273\" (UID: \"fb88c2dd-0bb3-4425-842f-b697d51f8273\") " Sep 29 10:06:11 crc kubenswrapper[4891]: I0929 10:06:11.277322 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb88c2dd-0bb3-4425-842f-b697d51f8273-config-data\") pod \"fb88c2dd-0bb3-4425-842f-b697d51f8273\" (UID: \"fb88c2dd-0bb3-4425-842f-b697d51f8273\") " Sep 29 10:06:11 crc kubenswrapper[4891]: I0929 10:06:11.277354 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb88c2dd-0bb3-4425-842f-b697d51f8273-combined-ca-bundle\") pod \"fb88c2dd-0bb3-4425-842f-b697d51f8273\" (UID: \"fb88c2dd-0bb3-4425-842f-b697d51f8273\") " Sep 29 10:06:11 crc kubenswrapper[4891]: I0929 10:06:11.277386 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb88c2dd-0bb3-4425-842f-b697d51f8273-logs\") pod \"fb88c2dd-0bb3-4425-842f-b697d51f8273\" (UID: \"fb88c2dd-0bb3-4425-842f-b697d51f8273\") " Sep 29 10:06:11 crc kubenswrapper[4891]: I0929 10:06:11.278052 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/fb88c2dd-0bb3-4425-842f-b697d51f8273-logs" (OuterVolumeSpecName: "logs") pod "fb88c2dd-0bb3-4425-842f-b697d51f8273" (UID: "fb88c2dd-0bb3-4425-842f-b697d51f8273"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:06:11 crc kubenswrapper[4891]: I0929 10:06:11.283913 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb88c2dd-0bb3-4425-842f-b697d51f8273-kube-api-access-cfr6b" (OuterVolumeSpecName: "kube-api-access-cfr6b") pod "fb88c2dd-0bb3-4425-842f-b697d51f8273" (UID: "fb88c2dd-0bb3-4425-842f-b697d51f8273"). InnerVolumeSpecName "kube-api-access-cfr6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:06:11 crc kubenswrapper[4891]: I0929 10:06:11.286366 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb88c2dd-0bb3-4425-842f-b697d51f8273-scripts" (OuterVolumeSpecName: "scripts") pod "fb88c2dd-0bb3-4425-842f-b697d51f8273" (UID: "fb88c2dd-0bb3-4425-842f-b697d51f8273"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:11 crc kubenswrapper[4891]: I0929 10:06:11.329691 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb88c2dd-0bb3-4425-842f-b697d51f8273-config-data" (OuterVolumeSpecName: "config-data") pod "fb88c2dd-0bb3-4425-842f-b697d51f8273" (UID: "fb88c2dd-0bb3-4425-842f-b697d51f8273"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:11 crc kubenswrapper[4891]: I0929 10:06:11.340265 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb88c2dd-0bb3-4425-842f-b697d51f8273-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb88c2dd-0bb3-4425-842f-b697d51f8273" (UID: "fb88c2dd-0bb3-4425-842f-b697d51f8273"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:11 crc kubenswrapper[4891]: I0929 10:06:11.379771 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb88c2dd-0bb3-4425-842f-b697d51f8273-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:11 crc kubenswrapper[4891]: I0929 10:06:11.379831 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfr6b\" (UniqueName: \"kubernetes.io/projected/fb88c2dd-0bb3-4425-842f-b697d51f8273-kube-api-access-cfr6b\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:11 crc kubenswrapper[4891]: I0929 10:06:11.379845 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb88c2dd-0bb3-4425-842f-b697d51f8273-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:11 crc kubenswrapper[4891]: I0929 10:06:11.379857 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb88c2dd-0bb3-4425-842f-b697d51f8273-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:11 crc kubenswrapper[4891]: I0929 10:06:11.379868 4891 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb88c2dd-0bb3-4425-842f-b697d51f8273-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:11 crc kubenswrapper[4891]: I0929 10:06:11.871238 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jv6jw" event={"ID":"fb88c2dd-0bb3-4425-842f-b697d51f8273","Type":"ContainerDied","Data":"edba210b63d0d5b6fde69a866c8df40d894a08465365b4c5e2e97a658901bf0e"} Sep 29 10:06:11 crc kubenswrapper[4891]: I0929 10:06:11.871284 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edba210b63d0d5b6fde69a866c8df40d894a08465365b4c5e2e97a658901bf0e" Sep 29 10:06:11 crc kubenswrapper[4891]: I0929 10:06:11.871352 4891 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/placement-db-sync-jv6jw" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.296124 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.304993 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-696f7ffc96-xhjxt"] Sep 29 10:06:12 crc kubenswrapper[4891]: E0929 10:06:12.305502 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="048fe6e5-9e83-456c-965f-d4b0a7378b02" containerName="dnsmasq-dns" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.305523 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="048fe6e5-9e83-456c-965f-d4b0a7378b02" containerName="dnsmasq-dns" Sep 29 10:06:12 crc kubenswrapper[4891]: E0929 10:06:12.305540 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb88c2dd-0bb3-4425-842f-b697d51f8273" containerName="placement-db-sync" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.305548 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb88c2dd-0bb3-4425-842f-b697d51f8273" containerName="placement-db-sync" Sep 29 10:06:12 crc kubenswrapper[4891]: E0929 10:06:12.305571 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="048fe6e5-9e83-456c-965f-d4b0a7378b02" containerName="init" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.305580 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="048fe6e5-9e83-456c-965f-d4b0a7378b02" containerName="init" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.305839 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="048fe6e5-9e83-456c-965f-d4b0a7378b02" containerName="dnsmasq-dns" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.305857 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb88c2dd-0bb3-4425-842f-b697d51f8273" containerName="placement-db-sync" 
Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.307149 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.310391 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.310690 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.310933 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.311008 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-kfx6v" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.311066 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.336586 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-696f7ffc96-xhjxt"] Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.415653 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-config\") pod \"048fe6e5-9e83-456c-965f-d4b0a7378b02\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.415834 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrttv\" (UniqueName: \"kubernetes.io/projected/048fe6e5-9e83-456c-965f-d4b0a7378b02-kube-api-access-jrttv\") pod \"048fe6e5-9e83-456c-965f-d4b0a7378b02\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.415900 4891 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-ovsdbserver-sb\") pod \"048fe6e5-9e83-456c-965f-d4b0a7378b02\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.415937 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-dns-swift-storage-0\") pod \"048fe6e5-9e83-456c-965f-d4b0a7378b02\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.416084 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-dns-svc\") pod \"048fe6e5-9e83-456c-965f-d4b0a7378b02\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.416196 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-ovsdbserver-nb\") pod \"048fe6e5-9e83-456c-965f-d4b0a7378b02\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.420170 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ec980b-adab-4378-a632-0de5186250dd-config-data\") pod \"placement-696f7ffc96-xhjxt\" (UID: \"e8ec980b-adab-4378-a632-0de5186250dd\") " pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.420225 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ec980b-adab-4378-a632-0de5186250dd-public-tls-certs\") pod 
\"placement-696f7ffc96-xhjxt\" (UID: \"e8ec980b-adab-4378-a632-0de5186250dd\") " pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.420583 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h54zf\" (UniqueName: \"kubernetes.io/projected/e8ec980b-adab-4378-a632-0de5186250dd-kube-api-access-h54zf\") pod \"placement-696f7ffc96-xhjxt\" (UID: \"e8ec980b-adab-4378-a632-0de5186250dd\") " pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.422411 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8ec980b-adab-4378-a632-0de5186250dd-logs\") pod \"placement-696f7ffc96-xhjxt\" (UID: \"e8ec980b-adab-4378-a632-0de5186250dd\") " pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.422494 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8ec980b-adab-4378-a632-0de5186250dd-scripts\") pod \"placement-696f7ffc96-xhjxt\" (UID: \"e8ec980b-adab-4378-a632-0de5186250dd\") " pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.422535 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ec980b-adab-4378-a632-0de5186250dd-internal-tls-certs\") pod \"placement-696f7ffc96-xhjxt\" (UID: \"e8ec980b-adab-4378-a632-0de5186250dd\") " pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.422570 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e8ec980b-adab-4378-a632-0de5186250dd-combined-ca-bundle\") pod \"placement-696f7ffc96-xhjxt\" (UID: \"e8ec980b-adab-4378-a632-0de5186250dd\") " pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.447851 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/048fe6e5-9e83-456c-965f-d4b0a7378b02-kube-api-access-jrttv" (OuterVolumeSpecName: "kube-api-access-jrttv") pod "048fe6e5-9e83-456c-965f-d4b0a7378b02" (UID: "048fe6e5-9e83-456c-965f-d4b0a7378b02"). InnerVolumeSpecName "kube-api-access-jrttv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.488138 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "048fe6e5-9e83-456c-965f-d4b0a7378b02" (UID: "048fe6e5-9e83-456c-965f-d4b0a7378b02"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.496873 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "048fe6e5-9e83-456c-965f-d4b0a7378b02" (UID: "048fe6e5-9e83-456c-965f-d4b0a7378b02"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.497034 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "048fe6e5-9e83-456c-965f-d4b0a7378b02" (UID: "048fe6e5-9e83-456c-965f-d4b0a7378b02"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.524244 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-config" (OuterVolumeSpecName: "config") pod "048fe6e5-9e83-456c-965f-d4b0a7378b02" (UID: "048fe6e5-9e83-456c-965f-d4b0a7378b02"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.524535 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-config\") pod \"048fe6e5-9e83-456c-965f-d4b0a7378b02\" (UID: \"048fe6e5-9e83-456c-965f-d4b0a7378b02\") " Sep 29 10:06:12 crc kubenswrapper[4891]: W0929 10:06:12.524667 4891 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/048fe6e5-9e83-456c-965f-d4b0a7378b02/volumes/kubernetes.io~configmap/config Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.524687 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-config" (OuterVolumeSpecName: "config") pod "048fe6e5-9e83-456c-965f-d4b0a7378b02" (UID: "048fe6e5-9e83-456c-965f-d4b0a7378b02"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.524908 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8ec980b-adab-4378-a632-0de5186250dd-scripts\") pod \"placement-696f7ffc96-xhjxt\" (UID: \"e8ec980b-adab-4378-a632-0de5186250dd\") " pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.524944 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ec980b-adab-4378-a632-0de5186250dd-internal-tls-certs\") pod \"placement-696f7ffc96-xhjxt\" (UID: \"e8ec980b-adab-4378-a632-0de5186250dd\") " pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.525017 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ec980b-adab-4378-a632-0de5186250dd-combined-ca-bundle\") pod \"placement-696f7ffc96-xhjxt\" (UID: \"e8ec980b-adab-4378-a632-0de5186250dd\") " pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.525093 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ec980b-adab-4378-a632-0de5186250dd-config-data\") pod \"placement-696f7ffc96-xhjxt\" (UID: \"e8ec980b-adab-4378-a632-0de5186250dd\") " pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.525116 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ec980b-adab-4378-a632-0de5186250dd-public-tls-certs\") pod \"placement-696f7ffc96-xhjxt\" (UID: \"e8ec980b-adab-4378-a632-0de5186250dd\") " pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:12 crc 
kubenswrapper[4891]: I0929 10:06:12.525251 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h54zf\" (UniqueName: \"kubernetes.io/projected/e8ec980b-adab-4378-a632-0de5186250dd-kube-api-access-h54zf\") pod \"placement-696f7ffc96-xhjxt\" (UID: \"e8ec980b-adab-4378-a632-0de5186250dd\") " pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.525331 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8ec980b-adab-4378-a632-0de5186250dd-logs\") pod \"placement-696f7ffc96-xhjxt\" (UID: \"e8ec980b-adab-4378-a632-0de5186250dd\") " pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.525379 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrttv\" (UniqueName: \"kubernetes.io/projected/048fe6e5-9e83-456c-965f-d4b0a7378b02-kube-api-access-jrttv\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.525390 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.525399 4891 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.525408 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.525419 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.526019 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8ec980b-adab-4378-a632-0de5186250dd-logs\") pod \"placement-696f7ffc96-xhjxt\" (UID: \"e8ec980b-adab-4378-a632-0de5186250dd\") " pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.530479 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "048fe6e5-9e83-456c-965f-d4b0a7378b02" (UID: "048fe6e5-9e83-456c-965f-d4b0a7378b02"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.532274 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ec980b-adab-4378-a632-0de5186250dd-internal-tls-certs\") pod \"placement-696f7ffc96-xhjxt\" (UID: \"e8ec980b-adab-4378-a632-0de5186250dd\") " pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.532691 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ec980b-adab-4378-a632-0de5186250dd-public-tls-certs\") pod \"placement-696f7ffc96-xhjxt\" (UID: \"e8ec980b-adab-4378-a632-0de5186250dd\") " pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.532847 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ec980b-adab-4378-a632-0de5186250dd-combined-ca-bundle\") pod \"placement-696f7ffc96-xhjxt\" (UID: \"e8ec980b-adab-4378-a632-0de5186250dd\") " 
pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.533131 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8ec980b-adab-4378-a632-0de5186250dd-scripts\") pod \"placement-696f7ffc96-xhjxt\" (UID: \"e8ec980b-adab-4378-a632-0de5186250dd\") " pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.542578 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ec980b-adab-4378-a632-0de5186250dd-config-data\") pod \"placement-696f7ffc96-xhjxt\" (UID: \"e8ec980b-adab-4378-a632-0de5186250dd\") " pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.547350 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h54zf\" (UniqueName: \"kubernetes.io/projected/e8ec980b-adab-4378-a632-0de5186250dd-kube-api-access-h54zf\") pod \"placement-696f7ffc96-xhjxt\" (UID: \"e8ec980b-adab-4378-a632-0de5186250dd\") " pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.628040 4891 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/048fe6e5-9e83-456c-965f-d4b0a7378b02-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.641424 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.882349 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" event={"ID":"048fe6e5-9e83-456c-965f-d4b0a7378b02","Type":"ContainerDied","Data":"75f25d90bb1ff838a5163ffe9ef33db46cd7331529fe7fc92e95e16b45715f6a"} Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.882417 4891 scope.go:117] "RemoveContainer" containerID="1d3f9e19f41504fccd8bb715da857cc470817c51340a28de447051e0e4a720f7" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.882593 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-tkdcz" Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.897936 4891 generic.go:334] "Generic (PLEG): container finished" podID="4cfbacc9-ec34-4515-9874-1fd082cdbea3" containerID="40b5245eac00287a95c8b00f8c6d769adb11cdf2a57f7302ef53817064744865" exitCode=0 Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.897995 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7975d54bd8-pl4st" event={"ID":"4cfbacc9-ec34-4515-9874-1fd082cdbea3","Type":"ContainerDied","Data":"40b5245eac00287a95c8b00f8c6d769adb11cdf2a57f7302ef53817064744865"} Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.923841 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-tkdcz"] Sep 29 10:06:12 crc kubenswrapper[4891]: I0929 10:06:12.935318 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-tkdcz"] Sep 29 10:06:13 crc kubenswrapper[4891]: E0929 10:06:13.176677 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Sep 29 10:06:13 crc kubenswrapper[4891]: E0929 10:06:13.177044 4891 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5tkzx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(84c6c4b8-5f24-48db-884c-dd0669cb67cc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 29 10:06:13 crc kubenswrapper[4891]: E0929 10:06:13.178309 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="84c6c4b8-5f24-48db-884c-dd0669cb67cc" Sep 29 10:06:13 crc kubenswrapper[4891]: I0929 10:06:13.365527 4891 scope.go:117] "RemoveContainer" containerID="4d2e676f7926ea0e511b8a90a42809a09df26ec24c0e6b7ff34702547496b1c8" Sep 29 10:06:13 crc kubenswrapper[4891]: I0929 10:06:13.839870 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-748f5656b6-pdpff"] Sep 29 10:06:13 crc kubenswrapper[4891]: W0929 10:06:13.845481 4891 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8d04caa_6db6_41c2_bf9b_f5ed373e9799.slice/crio-da0a7b5528e96203d643e8455af7eae55cd9baf17f64f4eb5b6e8bd3c662efed WatchSource:0}: Error finding container da0a7b5528e96203d643e8455af7eae55cd9baf17f64f4eb5b6e8bd3c662efed: Status 404 returned error can't find the container with id da0a7b5528e96203d643e8455af7eae55cd9baf17f64f4eb5b6e8bd3c662efed Sep 29 10:06:13 crc kubenswrapper[4891]: I0929 10:06:13.911369 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-748f5656b6-pdpff" event={"ID":"d8d04caa-6db6-41c2-bf9b-f5ed373e9799","Type":"ContainerStarted","Data":"da0a7b5528e96203d643e8455af7eae55cd9baf17f64f4eb5b6e8bd3c662efed"} Sep 29 10:06:13 crc kubenswrapper[4891]: I0929 10:06:13.917250 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="84c6c4b8-5f24-48db-884c-dd0669cb67cc" containerName="ceilometer-central-agent" containerID="cri-o://5b7730f2941b04ddfedec2e88fff8b377d172319d98cc03b39f8f392b163219d" gracePeriod=30 Sep 29 10:06:13 crc kubenswrapper[4891]: I0929 10:06:13.918418 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56d6cd75c7-6j75x" event={"ID":"af72b6bb-1073-4ceb-b593-209e646bba5a","Type":"ContainerStarted","Data":"8475e448553910e7ae310503e9ecc1529680351db26d33ef7481fb12a6096f69"} Sep 29 10:06:13 crc kubenswrapper[4891]: I0929 10:06:13.918458 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-56d6cd75c7-6j75x" Sep 29 10:06:13 crc kubenswrapper[4891]: I0929 10:06:13.918470 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56d6cd75c7-6j75x" event={"ID":"af72b6bb-1073-4ceb-b593-209e646bba5a","Type":"ContainerStarted","Data":"f2b703318e3f90e75a6d3a37e84b779aaa9d298fc0d58f5598ed61706a6cd9ea"} Sep 29 10:06:13 crc kubenswrapper[4891]: I0929 10:06:13.918778 4891 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/ceilometer-0" podUID="84c6c4b8-5f24-48db-884c-dd0669cb67cc" containerName="sg-core" containerID="cri-o://9269e8a955768ee357f1164b2c49f489cfe419feb6e4330239fffef531b0d0ee" gracePeriod=30 Sep 29 10:06:13 crc kubenswrapper[4891]: I0929 10:06:13.918879 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="84c6c4b8-5f24-48db-884c-dd0669cb67cc" containerName="ceilometer-notification-agent" containerID="cri-o://141da6b5d814ce53857c42a262e0c77be1c0eebedc9c307e75563c7d76b21243" gracePeriod=30 Sep 29 10:06:13 crc kubenswrapper[4891]: I0929 10:06:13.964382 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-56d6cd75c7-6j75x" podStartSLOduration=20.964367868 podStartE2EDuration="20.964367868s" podCreationTimestamp="2025-09-29 10:05:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:06:13.962520824 +0000 UTC m=+1104.167689145" watchObservedRunningTime="2025-09-29 10:06:13.964367868 +0000 UTC m=+1104.169536189" Sep 29 10:06:14 crc kubenswrapper[4891]: W0929 10:06:14.253671 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8ec980b_adab_4378_a632_0de5186250dd.slice/crio-e58142794d49478db0b1f088d79787c6165bfc3585fc1e7e34eff7937c1244fd WatchSource:0}: Error finding container e58142794d49478db0b1f088d79787c6165bfc3585fc1e7e34eff7937c1244fd: Status 404 returned error can't find the container with id e58142794d49478db0b1f088d79787c6165bfc3585fc1e7e34eff7937c1244fd Sep 29 10:06:14 crc kubenswrapper[4891]: I0929 10:06:14.256285 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-hwd4m"] Sep 29 10:06:14 crc kubenswrapper[4891]: I0929 10:06:14.265745 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-worker-677dd7cdbc-drpcv"]
Sep 29 10:06:14 crc kubenswrapper[4891]: I0929 10:06:14.275468 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5bbdbf46b6-mv6rb"]
Sep 29 10:06:14 crc kubenswrapper[4891]: I0929 10:06:14.287231 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-696f7ffc96-xhjxt"]
Sep 29 10:06:14 crc kubenswrapper[4891]: I0929 10:06:14.358543 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7ff9945478-9v77b"]
Sep 29 10:06:14 crc kubenswrapper[4891]: I0929 10:06:14.417982 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="048fe6e5-9e83-456c-965f-d4b0a7378b02" path="/var/lib/kubelet/pods/048fe6e5-9e83-456c-965f-d4b0a7378b02/volumes"
Sep 29 10:06:14 crc kubenswrapper[4891]: I0929 10:06:14.580801 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7975d54bd8-pl4st" podUID="4cfbacc9-ec34-4515-9874-1fd082cdbea3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused"
Sep 29 10:06:14 crc kubenswrapper[4891]: I0929 10:06:14.939967 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-677dd7cdbc-drpcv" event={"ID":"e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99","Type":"ContainerStarted","Data":"c840fe881d67adeb290df4eb75e879ff5917f042a6bc32fe75b2e6bbe36c8c23"}
Sep 29 10:06:14 crc kubenswrapper[4891]: I0929 10:06:14.947415 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ff9945478-9v77b" event={"ID":"7efda6e6-9019-4909-96be-068496b2577f","Type":"ContainerStarted","Data":"ad23b6eb42f60074e30123b5c4f360575f851c7dc2018fa44d8f7fbe095e8cbb"}
Sep 29 10:06:14 crc kubenswrapper[4891]: I0929 10:06:14.960336 4891 generic.go:334] "Generic (PLEG): container finished" podID="84c6c4b8-5f24-48db-884c-dd0669cb67cc" containerID="9269e8a955768ee357f1164b2c49f489cfe419feb6e4330239fffef531b0d0ee" exitCode=2
Sep 29 10:06:14 crc kubenswrapper[4891]: I0929 10:06:14.960376 4891 generic.go:334] "Generic (PLEG): container finished" podID="84c6c4b8-5f24-48db-884c-dd0669cb67cc" containerID="5b7730f2941b04ddfedec2e88fff8b377d172319d98cc03b39f8f392b163219d" exitCode=0
Sep 29 10:06:14 crc kubenswrapper[4891]: I0929 10:06:14.960419 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84c6c4b8-5f24-48db-884c-dd0669cb67cc","Type":"ContainerDied","Data":"9269e8a955768ee357f1164b2c49f489cfe419feb6e4330239fffef531b0d0ee"}
Sep 29 10:06:14 crc kubenswrapper[4891]: I0929 10:06:14.960469 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84c6c4b8-5f24-48db-884c-dd0669cb67cc","Type":"ContainerDied","Data":"5b7730f2941b04ddfedec2e88fff8b377d172319d98cc03b39f8f392b163219d"}
Sep 29 10:06:14 crc kubenswrapper[4891]: I0929 10:06:14.972078 4891 generic.go:334] "Generic (PLEG): container finished" podID="eeaee9da-afe7-4854-95a3-ac91aeac850e" containerID="9b60197635c0a43a6a5db94c4bef35282804d122359d9a2ab3cfcb9b62f122a9" exitCode=0
Sep 29 10:06:14 crc kubenswrapper[4891]: I0929 10:06:14.972223 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" event={"ID":"eeaee9da-afe7-4854-95a3-ac91aeac850e","Type":"ContainerDied","Data":"9b60197635c0a43a6a5db94c4bef35282804d122359d9a2ab3cfcb9b62f122a9"}
Sep 29 10:06:14 crc kubenswrapper[4891]: I0929 10:06:14.972267 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" event={"ID":"eeaee9da-afe7-4854-95a3-ac91aeac850e","Type":"ContainerStarted","Data":"de826fc7ee84a301aa4432579ff102902b1c852b020a29cbaaf175771cc60706"}
Sep 29 10:06:14 crc kubenswrapper[4891]: I0929 10:06:14.978590 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bbdbf46b6-mv6rb" event={"ID":"cb3bb277-4352-4266-a5f1-9fbb8ea07eed","Type":"ContainerStarted","Data":"bbfe07f5894f58f5d90c81be72bf85dcfc65fb5c362fb57361bb9d99d21d2915"}
Sep 29 10:06:14 crc kubenswrapper[4891]: I0929 10:06:14.978619 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bbdbf46b6-mv6rb" event={"ID":"cb3bb277-4352-4266-a5f1-9fbb8ea07eed","Type":"ContainerStarted","Data":"ee4e1d6e8fbe16e6ab102b888f35c18efa8528d5916b3ccbffd6a334f5cdd9cd"}
Sep 29 10:06:14 crc kubenswrapper[4891]: I0929 10:06:14.982953 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-696f7ffc96-xhjxt" event={"ID":"e8ec980b-adab-4378-a632-0de5186250dd","Type":"ContainerStarted","Data":"bed5c4f3c3b49ff0e0a79acc5b990ea53ddfd09477e02bbde3be2527b6aca96b"}
Sep 29 10:06:14 crc kubenswrapper[4891]: I0929 10:06:14.982982 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-696f7ffc96-xhjxt" event={"ID":"e8ec980b-adab-4378-a632-0de5186250dd","Type":"ContainerStarted","Data":"e58142794d49478db0b1f088d79787c6165bfc3585fc1e7e34eff7937c1244fd"}
Sep 29 10:06:15 crc kubenswrapper[4891]: I0929 10:06:15.724104 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 29 10:06:15 crc kubenswrapper[4891]: I0929 10:06:15.917172 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84c6c4b8-5f24-48db-884c-dd0669cb67cc-combined-ca-bundle\") pod \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") "
Sep 29 10:06:15 crc kubenswrapper[4891]: I0929 10:06:15.917259 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84c6c4b8-5f24-48db-884c-dd0669cb67cc-run-httpd\") pod \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") "
Sep 29 10:06:15 crc kubenswrapper[4891]: I0929 10:06:15.917359 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84c6c4b8-5f24-48db-884c-dd0669cb67cc-log-httpd\") pod \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") "
Sep 29 10:06:15 crc kubenswrapper[4891]: I0929 10:06:15.917388 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84c6c4b8-5f24-48db-884c-dd0669cb67cc-sg-core-conf-yaml\") pod \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") "
Sep 29 10:06:15 crc kubenswrapper[4891]: I0929 10:06:15.917700 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84c6c4b8-5f24-48db-884c-dd0669cb67cc-scripts\") pod \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") "
Sep 29 10:06:15 crc kubenswrapper[4891]: I0929 10:06:15.917751 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tkzx\" (UniqueName: \"kubernetes.io/projected/84c6c4b8-5f24-48db-884c-dd0669cb67cc-kube-api-access-5tkzx\") pod \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") "
Sep 29 10:06:15 crc kubenswrapper[4891]: I0929 10:06:15.917830 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84c6c4b8-5f24-48db-884c-dd0669cb67cc-config-data\") pod \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\" (UID: \"84c6c4b8-5f24-48db-884c-dd0669cb67cc\") "
Sep 29 10:06:15 crc kubenswrapper[4891]: I0929 10:06:15.918093 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84c6c4b8-5f24-48db-884c-dd0669cb67cc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "84c6c4b8-5f24-48db-884c-dd0669cb67cc" (UID: "84c6c4b8-5f24-48db-884c-dd0669cb67cc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 10:06:15 crc kubenswrapper[4891]: I0929 10:06:15.918351 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84c6c4b8-5f24-48db-884c-dd0669cb67cc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "84c6c4b8-5f24-48db-884c-dd0669cb67cc" (UID: "84c6c4b8-5f24-48db-884c-dd0669cb67cc"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 10:06:15 crc kubenswrapper[4891]: I0929 10:06:15.918648 4891 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84c6c4b8-5f24-48db-884c-dd0669cb67cc-run-httpd\") on node \"crc\" DevicePath \"\""
Sep 29 10:06:15 crc kubenswrapper[4891]: I0929 10:06:15.918661 4891 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84c6c4b8-5f24-48db-884c-dd0669cb67cc-log-httpd\") on node \"crc\" DevicePath \"\""
Sep 29 10:06:15 crc kubenswrapper[4891]: I0929 10:06:15.925041 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84c6c4b8-5f24-48db-884c-dd0669cb67cc-scripts" (OuterVolumeSpecName: "scripts") pod "84c6c4b8-5f24-48db-884c-dd0669cb67cc" (UID: "84c6c4b8-5f24-48db-884c-dd0669cb67cc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:06:15 crc kubenswrapper[4891]: I0929 10:06:15.933095 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84c6c4b8-5f24-48db-884c-dd0669cb67cc-kube-api-access-5tkzx" (OuterVolumeSpecName: "kube-api-access-5tkzx") pod "84c6c4b8-5f24-48db-884c-dd0669cb67cc" (UID: "84c6c4b8-5f24-48db-884c-dd0669cb67cc"). InnerVolumeSpecName "kube-api-access-5tkzx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:06:15 crc kubenswrapper[4891]: I0929 10:06:15.961884 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84c6c4b8-5f24-48db-884c-dd0669cb67cc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "84c6c4b8-5f24-48db-884c-dd0669cb67cc" (UID: "84c6c4b8-5f24-48db-884c-dd0669cb67cc"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:06:15 crc kubenswrapper[4891]: I0929 10:06:15.984701 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84c6c4b8-5f24-48db-884c-dd0669cb67cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84c6c4b8-5f24-48db-884c-dd0669cb67cc" (UID: "84c6c4b8-5f24-48db-884c-dd0669cb67cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:06:15 crc kubenswrapper[4891]: I0929 10:06:15.991928 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84c6c4b8-5f24-48db-884c-dd0669cb67cc-config-data" (OuterVolumeSpecName: "config-data") pod "84c6c4b8-5f24-48db-884c-dd0669cb67cc" (UID: "84c6c4b8-5f24-48db-884c-dd0669cb67cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:06:15 crc kubenswrapper[4891]: I0929 10:06:15.996300 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" event={"ID":"eeaee9da-afe7-4854-95a3-ac91aeac850e","Type":"ContainerStarted","Data":"936cd5a0ee9dfc94920ca83369700abd1c8da7b79e8649b713110f41cd242f00"}
Sep 29 10:06:15 crc kubenswrapper[4891]: I0929 10:06:15.996605 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688c87cc99-hwd4m"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.005447 4891 generic.go:334] "Generic (PLEG): container finished" podID="84c6c4b8-5f24-48db-884c-dd0669cb67cc" containerID="141da6b5d814ce53857c42a262e0c77be1c0eebedc9c307e75563c7d76b21243" exitCode=0
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.005531 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84c6c4b8-5f24-48db-884c-dd0669cb67cc","Type":"ContainerDied","Data":"141da6b5d814ce53857c42a262e0c77be1c0eebedc9c307e75563c7d76b21243"}
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.005565 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84c6c4b8-5f24-48db-884c-dd0669cb67cc","Type":"ContainerDied","Data":"fd50780ba2d96f0284e2a6870e4918789821db24f6d1467c53bccdd0d21b4b15"}
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.005587 4891 scope.go:117] "RemoveContainer" containerID="9269e8a955768ee357f1164b2c49f489cfe419feb6e4330239fffef531b0d0ee"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.005766 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.010994 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bbdbf46b6-mv6rb" event={"ID":"cb3bb277-4352-4266-a5f1-9fbb8ea07eed","Type":"ContainerStarted","Data":"6218225394a322a6f76553e58b455fb8257dccf40460bd9388c6b777b244fd53"}
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.011674 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5bbdbf46b6-mv6rb"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.011770 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5bbdbf46b6-mv6rb"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.016278 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-696f7ffc96-xhjxt" event={"ID":"e8ec980b-adab-4378-a632-0de5186250dd","Type":"ContainerStarted","Data":"392b9497828d94d4ed2f20c3d278dffc575f3010aafbaaea5f91041177e53032"}
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.017033 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-696f7ffc96-xhjxt"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.017173 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-696f7ffc96-xhjxt"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.019862 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84c6c4b8-5f24-48db-884c-dd0669cb67cc-scripts\") on node \"crc\" DevicePath \"\""
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.019945 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tkzx\" (UniqueName: \"kubernetes.io/projected/84c6c4b8-5f24-48db-884c-dd0669cb67cc-kube-api-access-5tkzx\") on node \"crc\" DevicePath \"\""
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.020049 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84c6c4b8-5f24-48db-884c-dd0669cb67cc-config-data\") on node \"crc\" DevicePath \"\""
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.020120 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84c6c4b8-5f24-48db-884c-dd0669cb67cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.020203 4891 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84c6c4b8-5f24-48db-884c-dd0669cb67cc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.021920 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ff9945478-9v77b" event={"ID":"7efda6e6-9019-4909-96be-068496b2577f","Type":"ContainerStarted","Data":"9e474bff1be43b61615a68a79d49d58d3736d8974c0231ae0ba6a6c30aa69d73"}
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.022063 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ff9945478-9v77b" event={"ID":"7efda6e6-9019-4909-96be-068496b2577f","Type":"ContainerStarted","Data":"ea9b5daf715e82d078d53adcb574ff842df1cb63ca9a17e77d4f5fa929f08d7a"}
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.022252 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7ff9945478-9v77b"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.022361 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7ff9945478-9v77b"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.033499 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" podStartSLOduration=9.033461396 podStartE2EDuration="9.033461396s" podCreationTimestamp="2025-09-29 10:06:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:06:16.031328773 +0000 UTC m=+1106.236497094" watchObservedRunningTime="2025-09-29 10:06:16.033461396 +0000 UTC m=+1106.238629717"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.066266 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5bbdbf46b6-mv6rb" podStartSLOduration=9.066228325 podStartE2EDuration="9.066228325s" podCreationTimestamp="2025-09-29 10:06:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:06:16.050000505 +0000 UTC m=+1106.255168836" watchObservedRunningTime="2025-09-29 10:06:16.066228325 +0000 UTC m=+1106.271396646"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.077684 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-696f7ffc96-xhjxt" podStartSLOduration=4.077661913 podStartE2EDuration="4.077661913s" podCreationTimestamp="2025-09-29 10:06:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:06:16.07484709 +0000 UTC m=+1106.280015421" watchObservedRunningTime="2025-09-29 10:06:16.077661913 +0000 UTC m=+1106.282830234"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.124229 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7ff9945478-9v77b" podStartSLOduration=6.124202049 podStartE2EDuration="6.124202049s" podCreationTimestamp="2025-09-29 10:06:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:06:16.102468957 +0000 UTC m=+1106.307637298" watchObservedRunningTime="2025-09-29 10:06:16.124202049 +0000 UTC m=+1106.329370370"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.174940 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.183166 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.200092 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Sep 29 10:06:16 crc kubenswrapper[4891]: E0929 10:06:16.200679 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84c6c4b8-5f24-48db-884c-dd0669cb67cc" containerName="ceilometer-notification-agent"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.200705 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="84c6c4b8-5f24-48db-884c-dd0669cb67cc" containerName="ceilometer-notification-agent"
Sep 29 10:06:16 crc kubenswrapper[4891]: E0929 10:06:16.200732 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84c6c4b8-5f24-48db-884c-dd0669cb67cc" containerName="sg-core"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.200741 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="84c6c4b8-5f24-48db-884c-dd0669cb67cc" containerName="sg-core"
Sep 29 10:06:16 crc kubenswrapper[4891]: E0929 10:06:16.200777 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84c6c4b8-5f24-48db-884c-dd0669cb67cc" containerName="ceilometer-central-agent"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.200848 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="84c6c4b8-5f24-48db-884c-dd0669cb67cc" containerName="ceilometer-central-agent"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.201075 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="84c6c4b8-5f24-48db-884c-dd0669cb67cc" containerName="sg-core"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.201266 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="84c6c4b8-5f24-48db-884c-dd0669cb67cc" containerName="ceilometer-central-agent"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.201283 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="84c6c4b8-5f24-48db-884c-dd0669cb67cc" containerName="ceilometer-notification-agent"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.208897 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.211908 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.211934 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.214062 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.329525 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f76ad70-c0e6-424d-893b-634a7ab43070-run-httpd\") pod \"ceilometer-0\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " pod="openstack/ceilometer-0"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.329619 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f76ad70-c0e6-424d-893b-634a7ab43070-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " pod="openstack/ceilometer-0"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.329689 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f76ad70-c0e6-424d-893b-634a7ab43070-log-httpd\") pod \"ceilometer-0\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " pod="openstack/ceilometer-0"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.329709 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f76ad70-c0e6-424d-893b-634a7ab43070-config-data\") pod \"ceilometer-0\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " pod="openstack/ceilometer-0"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.329733 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfnxb\" (UniqueName: \"kubernetes.io/projected/0f76ad70-c0e6-424d-893b-634a7ab43070-kube-api-access-cfnxb\") pod \"ceilometer-0\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " pod="openstack/ceilometer-0"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.329778 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f76ad70-c0e6-424d-893b-634a7ab43070-scripts\") pod \"ceilometer-0\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " pod="openstack/ceilometer-0"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.329829 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f76ad70-c0e6-424d-893b-634a7ab43070-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " pod="openstack/ceilometer-0"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.408666 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84c6c4b8-5f24-48db-884c-dd0669cb67cc" path="/var/lib/kubelet/pods/84c6c4b8-5f24-48db-884c-dd0669cb67cc/volumes"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.431360 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f76ad70-c0e6-424d-893b-634a7ab43070-log-httpd\") pod \"ceilometer-0\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " pod="openstack/ceilometer-0"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.431415 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f76ad70-c0e6-424d-893b-634a7ab43070-config-data\") pod \"ceilometer-0\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " pod="openstack/ceilometer-0"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.431449 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfnxb\" (UniqueName: \"kubernetes.io/projected/0f76ad70-c0e6-424d-893b-634a7ab43070-kube-api-access-cfnxb\") pod \"ceilometer-0\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " pod="openstack/ceilometer-0"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.431479 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f76ad70-c0e6-424d-893b-634a7ab43070-scripts\") pod \"ceilometer-0\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " pod="openstack/ceilometer-0"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.431511 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f76ad70-c0e6-424d-893b-634a7ab43070-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " pod="openstack/ceilometer-0"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.431625 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f76ad70-c0e6-424d-893b-634a7ab43070-run-httpd\") pod \"ceilometer-0\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " pod="openstack/ceilometer-0"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.431664 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f76ad70-c0e6-424d-893b-634a7ab43070-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " pod="openstack/ceilometer-0"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.433116 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f76ad70-c0e6-424d-893b-634a7ab43070-log-httpd\") pod \"ceilometer-0\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " pod="openstack/ceilometer-0"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.433179 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f76ad70-c0e6-424d-893b-634a7ab43070-run-httpd\") pod \"ceilometer-0\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " pod="openstack/ceilometer-0"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.436766 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f76ad70-c0e6-424d-893b-634a7ab43070-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " pod="openstack/ceilometer-0"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.437222 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f76ad70-c0e6-424d-893b-634a7ab43070-scripts\") pod \"ceilometer-0\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " pod="openstack/ceilometer-0"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.438533 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f76ad70-c0e6-424d-893b-634a7ab43070-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " pod="openstack/ceilometer-0"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.452762 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f76ad70-c0e6-424d-893b-634a7ab43070-config-data\") pod \"ceilometer-0\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " pod="openstack/ceilometer-0"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.454458 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfnxb\" (UniqueName: \"kubernetes.io/projected/0f76ad70-c0e6-424d-893b-634a7ab43070-kube-api-access-cfnxb\") pod \"ceilometer-0\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " pod="openstack/ceilometer-0"
Sep 29 10:06:16 crc kubenswrapper[4891]: I0929 10:06:16.524366 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 29 10:06:17 crc kubenswrapper[4891]: I0929 10:06:17.056770 4891 scope.go:117] "RemoveContainer" containerID="141da6b5d814ce53857c42a262e0c77be1c0eebedc9c307e75563c7d76b21243"
Sep 29 10:06:17 crc kubenswrapper[4891]: I0929 10:06:17.090119 4891 scope.go:117] "RemoveContainer" containerID="5b7730f2941b04ddfedec2e88fff8b377d172319d98cc03b39f8f392b163219d"
Sep 29 10:06:17 crc kubenswrapper[4891]: I0929 10:06:17.306360 4891 scope.go:117] "RemoveContainer" containerID="9269e8a955768ee357f1164b2c49f489cfe419feb6e4330239fffef531b0d0ee"
Sep 29 10:06:17 crc kubenswrapper[4891]: E0929 10:06:17.307608 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9269e8a955768ee357f1164b2c49f489cfe419feb6e4330239fffef531b0d0ee\": container with ID starting with 9269e8a955768ee357f1164b2c49f489cfe419feb6e4330239fffef531b0d0ee not found: ID does not exist" containerID="9269e8a955768ee357f1164b2c49f489cfe419feb6e4330239fffef531b0d0ee"
Sep 29 10:06:17 crc kubenswrapper[4891]: I0929 10:06:17.307657 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9269e8a955768ee357f1164b2c49f489cfe419feb6e4330239fffef531b0d0ee"} err="failed to get container status \"9269e8a955768ee357f1164b2c49f489cfe419feb6e4330239fffef531b0d0ee\": rpc error: code = NotFound desc = could not find container \"9269e8a955768ee357f1164b2c49f489cfe419feb6e4330239fffef531b0d0ee\": container with ID starting with 9269e8a955768ee357f1164b2c49f489cfe419feb6e4330239fffef531b0d0ee not found: ID does not exist"
Sep 29 10:06:17 crc kubenswrapper[4891]: I0929 10:06:17.307688 4891 scope.go:117] "RemoveContainer" containerID="141da6b5d814ce53857c42a262e0c77be1c0eebedc9c307e75563c7d76b21243"
Sep 29 10:06:17 crc kubenswrapper[4891]: E0929 10:06:17.308622 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"141da6b5d814ce53857c42a262e0c77be1c0eebedc9c307e75563c7d76b21243\": container with ID starting with 141da6b5d814ce53857c42a262e0c77be1c0eebedc9c307e75563c7d76b21243 not found: ID does not exist" containerID="141da6b5d814ce53857c42a262e0c77be1c0eebedc9c307e75563c7d76b21243"
Sep 29 10:06:17 crc kubenswrapper[4891]: I0929 10:06:17.308678 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"141da6b5d814ce53857c42a262e0c77be1c0eebedc9c307e75563c7d76b21243"} err="failed to get container status \"141da6b5d814ce53857c42a262e0c77be1c0eebedc9c307e75563c7d76b21243\": rpc error: code = NotFound desc = could not find container \"141da6b5d814ce53857c42a262e0c77be1c0eebedc9c307e75563c7d76b21243\": container with ID starting with 141da6b5d814ce53857c42a262e0c77be1c0eebedc9c307e75563c7d76b21243 not found: ID does not exist"
Sep 29 10:06:17 crc kubenswrapper[4891]: I0929 10:06:17.308712 4891 scope.go:117] "RemoveContainer" containerID="5b7730f2941b04ddfedec2e88fff8b377d172319d98cc03b39f8f392b163219d"
Sep 29 10:06:17 crc kubenswrapper[4891]: E0929 10:06:17.310300 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b7730f2941b04ddfedec2e88fff8b377d172319d98cc03b39f8f392b163219d\": container with ID starting with 5b7730f2941b04ddfedec2e88fff8b377d172319d98cc03b39f8f392b163219d not found: ID does not exist" containerID="5b7730f2941b04ddfedec2e88fff8b377d172319d98cc03b39f8f392b163219d"
Sep 29 10:06:17 crc kubenswrapper[4891]: I0929 10:06:17.310646 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b7730f2941b04ddfedec2e88fff8b377d172319d98cc03b39f8f392b163219d"} err="failed to get container status \"5b7730f2941b04ddfedec2e88fff8b377d172319d98cc03b39f8f392b163219d\": rpc error: code = NotFound desc = could not find container \"5b7730f2941b04ddfedec2e88fff8b377d172319d98cc03b39f8f392b163219d\": container with ID starting with 5b7730f2941b04ddfedec2e88fff8b377d172319d98cc03b39f8f392b163219d not found: ID does not exist"
Sep 29 10:06:17 crc kubenswrapper[4891]: I0929 10:06:17.450473 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 29 10:06:17 crc kubenswrapper[4891]: W0929 10:06:17.477747 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f76ad70_c0e6_424d_893b_634a7ab43070.slice/crio-b0347f5cfbafb1c74131bffc2cd8de89387457012e269df301012120ed4543fc WatchSource:0}: Error finding container b0347f5cfbafb1c74131bffc2cd8de89387457012e269df301012120ed4543fc: Status 404 returned error can't find the container with id b0347f5cfbafb1c74131bffc2cd8de89387457012e269df301012120ed4543fc
Sep 29 10:06:18 crc kubenswrapper[4891]: I0929 10:06:18.043836 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-748f5656b6-pdpff" event={"ID":"d8d04caa-6db6-41c2-bf9b-f5ed373e9799","Type":"ContainerStarted","Data":"6a5429bb1d336fb826b8377a68f531b9fb44d2f3ea081b98ed371b1a10cad7ad"}
Sep 29 10:06:18 crc kubenswrapper[4891]: I0929 10:06:18.044127 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-748f5656b6-pdpff" event={"ID":"d8d04caa-6db6-41c2-bf9b-f5ed373e9799","Type":"ContainerStarted","Data":"bc2a35d1506ab27566a43e1afd66116acb3be4f8e0ef122043d57b1a8fda0354"}
Sep 29 10:06:18 crc kubenswrapper[4891]: I0929 10:06:18.047210 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-677dd7cdbc-drpcv" event={"ID":"e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99","Type":"ContainerStarted","Data":"f3551ba248fac10772db02b5245dc090c326873bf33fbb924170940d8d3c9b8f"}
Sep 29 10:06:18 crc kubenswrapper[4891]: I0929 10:06:18.047262 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-677dd7cdbc-drpcv" event={"ID":"e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99","Type":"ContainerStarted","Data":"74de6f8ba7395f271944cec279d184a800a30e740b3c4579af830743cff4f81f"}
Sep 29 10:06:18 crc kubenswrapper[4891]: I0929 10:06:18.059498 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f76ad70-c0e6-424d-893b-634a7ab43070","Type":"ContainerStarted","Data":"b0347f5cfbafb1c74131bffc2cd8de89387457012e269df301012120ed4543fc"}
Sep 29 10:06:18 crc kubenswrapper[4891]: I0929 10:06:18.080931 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-748f5656b6-pdpff" podStartSLOduration=7.832425914 podStartE2EDuration="11.080899774s" podCreationTimestamp="2025-09-29 10:06:07 +0000 UTC" firstStartedPulling="2025-09-29 10:06:13.847575605 +0000 UTC m=+1104.052743926" lastFinishedPulling="2025-09-29 10:06:17.096049455 +0000 UTC m=+1107.301217786" observedRunningTime="2025-09-29 10:06:18.07060809 +0000 UTC m=+1108.275776421" watchObservedRunningTime="2025-09-29 10:06:18.080899774 +0000 UTC m=+1108.286068135"
Sep 29 10:06:18 crc kubenswrapper[4891]: I0929 10:06:18.112561 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-677dd7cdbc-drpcv" podStartSLOduration=8.256847062 podStartE2EDuration="11.112533979s" podCreationTimestamp="2025-09-29 10:06:07 +0000 UTC" firstStartedPulling="2025-09-29 10:06:14.239849393 +0000 UTC m=+1104.445017714" lastFinishedPulling="2025-09-29 10:06:17.09553631 +0000 UTC m=+1107.300704631" observedRunningTime="2025-09-29 10:06:18.097375771 +0000 UTC m=+1108.302544112" watchObservedRunningTime="2025-09-29 10:06:18.112533979 +0000 UTC m=+1108.317702300"
Sep 29 10:06:19 crc kubenswrapper[4891]: I0929 10:06:19.075538 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f76ad70-c0e6-424d-893b-634a7ab43070","Type":"ContainerStarted","Data":"79ddc933223ca1af0fb495bb94e91b022f14f4c2d7d7e7b778ee09f62f412916"}
Sep 29 10:06:20 crc kubenswrapper[4891]: I0929 10:06:20.092025 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f76ad70-c0e6-424d-893b-634a7ab43070","Type":"ContainerStarted","Data":"9671bec061cc2461570459d1a75b20f020b9d721b46bf1564485d784f013575b"}
Sep 29 10:06:20 crc kubenswrapper[4891]: I0929 10:06:20.093018 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f76ad70-c0e6-424d-893b-634a7ab43070","Type":"ContainerStarted","Data":"7b7bedd4dece6c6040fff107a8f521c915de625d5e7778d03dcb10a07ca00474"}
Sep 29 10:06:20 crc kubenswrapper[4891]: I0929 10:06:20.170759 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7f8b757f88-7hxnh"
Sep 29 10:06:20 crc kubenswrapper[4891]: I0929 10:06:20.885547 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5b9ccf6696-q5jmg"
Sep 29 10:06:22 crc kubenswrapper[4891]: I0929 10:06:22.196252 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f76ad70-c0e6-424d-893b-634a7ab43070","Type":"ContainerStarted","Data":"fad7f33305a324a7601d3f1518d1f8834a41c85b0d4f168c1ed37594fa934615"}
Sep 29 10:06:22 crc kubenswrapper[4891]: I0929 10:06:22.196436 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Sep 29 10:06:22 crc kubenswrapper[4891]: I0929 10:06:22.221267 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.340215844 podStartE2EDuration="6.221249465s" podCreationTimestamp="2025-09-29 10:06:16 +0000 UTC" firstStartedPulling="2025-09-29 10:06:17.480098262 +0000 UTC m=+1107.685266593" lastFinishedPulling="2025-09-29 10:06:21.361131883 +0000 UTC m=+1111.566300214" observedRunningTime="2025-09-29 10:06:22.214685111 +0000 UTC m=+1112.419853462" watchObservedRunningTime="2025-09-29 10:06:22.221249465 +0000 UTC m=+1112.426417786"
Sep 29 10:06:22 crc kubenswrapper[4891]: I0929 10:06:22.863062 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-688c87cc99-hwd4m"
Sep 29 10:06:22 crc kubenswrapper[4891]: I0929 10:06:22.937773 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-qjfvh"]
Sep 29 10:06:22 crc kubenswrapper[4891]: I0929 10:06:22.938100 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" podUID="7e2bae28-f861-42b0-8fff-59b6516f85ff" containerName="dnsmasq-dns" containerID="cri-o://ad342bf4983501bf4838c89a78fd3408436b7bbd7c8e4311706a8ad829a18622" gracePeriod=10
Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.021442 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7ff9945478-9v77b"
Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.172040 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7ff9945478-9v77b"
Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.227333 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-hgm8s"
event={"ID":"f7dd6438-e338-4dce-b2be-0e36b359631c","Type":"ContainerStarted","Data":"c28443f9e7a2b71f9692eaaa2c85a35d227f85bfaefc93933ca38d04d8f8d31a"} Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.241050 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5bbdbf46b6-mv6rb"] Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.241600 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5bbdbf46b6-mv6rb" podUID="cb3bb277-4352-4266-a5f1-9fbb8ea07eed" containerName="barbican-api-log" containerID="cri-o://bbfe07f5894f58f5d90c81be72bf85dcfc65fb5c362fb57361bb9d99d21d2915" gracePeriod=30 Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.241714 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5bbdbf46b6-mv6rb" podUID="cb3bb277-4352-4266-a5f1-9fbb8ea07eed" containerName="barbican-api" containerID="cri-o://6218225394a322a6f76553e58b455fb8257dccf40460bd9388c6b777b244fd53" gracePeriod=30 Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.245279 4891 generic.go:334] "Generic (PLEG): container finished" podID="7e2bae28-f861-42b0-8fff-59b6516f85ff" containerID="ad342bf4983501bf4838c89a78fd3408436b7bbd7c8e4311706a8ad829a18622" exitCode=0 Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.246434 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" event={"ID":"7e2bae28-f861-42b0-8fff-59b6516f85ff","Type":"ContainerDied","Data":"ad342bf4983501bf4838c89a78fd3408436b7bbd7c8e4311706a8ad829a18622"} Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.259895 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bbdbf46b6-mv6rb" podUID="cb3bb277-4352-4266-a5f1-9fbb8ea07eed" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": EOF" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.260283 
4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-5bbdbf46b6-mv6rb" podUID="cb3bb277-4352-4266-a5f1-9fbb8ea07eed" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": EOF" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.260357 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bbdbf46b6-mv6rb" podUID="cb3bb277-4352-4266-a5f1-9fbb8ea07eed" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": EOF" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.268137 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-hgm8s" podStartSLOduration=3.401747318 podStartE2EDuration="43.268110269s" podCreationTimestamp="2025-09-29 10:05:40 +0000 UTC" firstStartedPulling="2025-09-29 10:05:41.368877361 +0000 UTC m=+1071.574045672" lastFinishedPulling="2025-09-29 10:06:21.235240302 +0000 UTC m=+1111.440408623" observedRunningTime="2025-09-29 10:06:23.260684879 +0000 UTC m=+1113.465853200" watchObservedRunningTime="2025-09-29 10:06:23.268110269 +0000 UTC m=+1113.473278590" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.281551 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bbdbf46b6-mv6rb" podUID="cb3bb277-4352-4266-a5f1-9fbb8ea07eed" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": EOF" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.281633 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bbdbf46b6-mv6rb" podUID="cb3bb277-4352-4266-a5f1-9fbb8ea07eed" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": EOF" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.281754 4891 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openstack/barbican-api-5bbdbf46b6-mv6rb" podUID="cb3bb277-4352-4266-a5f1-9fbb8ea07eed" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": EOF" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.630164 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.664345 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-56d6cd75c7-6j75x" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.756574 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7f8b757f88-7hxnh"] Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.756859 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7f8b757f88-7hxnh" podUID="adcba488-cf27-4f33-8046-52a1f2c92b9b" containerName="neutron-api" containerID="cri-o://b670472af43fb7a0fe22de0471be6eaf4c651568002cb2dac21913a7fed90a6f" gracePeriod=30 Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.757235 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7f8b757f88-7hxnh" podUID="adcba488-cf27-4f33-8046-52a1f2c92b9b" containerName="neutron-httpd" containerID="cri-o://b2f500907188a05e44bfb23e95e794f24e6465960eddc5cf8ab4ec28d770445a" gracePeriod=30 Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.757860 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh2rq\" (UniqueName: \"kubernetes.io/projected/7e2bae28-f861-42b0-8fff-59b6516f85ff-kube-api-access-mh2rq\") pod \"7e2bae28-f861-42b0-8fff-59b6516f85ff\" (UID: \"7e2bae28-f861-42b0-8fff-59b6516f85ff\") " Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.757965 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-ovsdbserver-nb\") pod \"7e2bae28-f861-42b0-8fff-59b6516f85ff\" (UID: \"7e2bae28-f861-42b0-8fff-59b6516f85ff\") " Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.758041 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-ovsdbserver-sb\") pod \"7e2bae28-f861-42b0-8fff-59b6516f85ff\" (UID: \"7e2bae28-f861-42b0-8fff-59b6516f85ff\") " Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.758168 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-dns-svc\") pod \"7e2bae28-f861-42b0-8fff-59b6516f85ff\" (UID: \"7e2bae28-f861-42b0-8fff-59b6516f85ff\") " Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.758250 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-config\") pod \"7e2bae28-f861-42b0-8fff-59b6516f85ff\" (UID: \"7e2bae28-f861-42b0-8fff-59b6516f85ff\") " Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.758301 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-dns-swift-storage-0\") pod \"7e2bae28-f861-42b0-8fff-59b6516f85ff\" (UID: \"7e2bae28-f861-42b0-8fff-59b6516f85ff\") " Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.776722 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Sep 29 10:06:23 crc kubenswrapper[4891]: E0929 10:06:23.788343 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2bae28-f861-42b0-8fff-59b6516f85ff" containerName="init" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.788379 4891 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="7e2bae28-f861-42b0-8fff-59b6516f85ff" containerName="init" Sep 29 10:06:23 crc kubenswrapper[4891]: E0929 10:06:23.788396 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2bae28-f861-42b0-8fff-59b6516f85ff" containerName="dnsmasq-dns" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.788403 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2bae28-f861-42b0-8fff-59b6516f85ff" containerName="dnsmasq-dns" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.788604 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e2bae28-f861-42b0-8fff-59b6516f85ff" containerName="dnsmasq-dns" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.789552 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.803159 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.803367 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.808057 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e2bae28-f861-42b0-8fff-59b6516f85ff-kube-api-access-mh2rq" (OuterVolumeSpecName: "kube-api-access-mh2rq") pod "7e2bae28-f861-42b0-8fff-59b6516f85ff" (UID: "7e2bae28-f861-42b0-8fff-59b6516f85ff"). InnerVolumeSpecName "kube-api-access-mh2rq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.818686 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.854542 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-w7zns" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.860939 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d3e0b825-0c6a-49ed-bb87-097ab0e686ee-openstack-config-secret\") pod \"openstackclient\" (UID: \"d3e0b825-0c6a-49ed-bb87-097ab0e686ee\") " pod="openstack/openstackclient" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.860997 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlstr\" (UniqueName: \"kubernetes.io/projected/d3e0b825-0c6a-49ed-bb87-097ab0e686ee-kube-api-access-qlstr\") pod \"openstackclient\" (UID: \"d3e0b825-0c6a-49ed-bb87-097ab0e686ee\") " pod="openstack/openstackclient" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.863853 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d3e0b825-0c6a-49ed-bb87-097ab0e686ee-openstack-config\") pod \"openstackclient\" (UID: \"d3e0b825-0c6a-49ed-bb87-097ab0e686ee\") " pod="openstack/openstackclient" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.864084 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e0b825-0c6a-49ed-bb87-097ab0e686ee-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d3e0b825-0c6a-49ed-bb87-097ab0e686ee\") " pod="openstack/openstackclient" Sep 29 10:06:23 crc 
kubenswrapper[4891]: I0929 10:06:23.864330 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh2rq\" (UniqueName: \"kubernetes.io/projected/7e2bae28-f861-42b0-8fff-59b6516f85ff-kube-api-access-mh2rq\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.919390 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7e2bae28-f861-42b0-8fff-59b6516f85ff" (UID: "7e2bae28-f861-42b0-8fff-59b6516f85ff"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.920582 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7e2bae28-f861-42b0-8fff-59b6516f85ff" (UID: "7e2bae28-f861-42b0-8fff-59b6516f85ff"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.933343 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7e2bae28-f861-42b0-8fff-59b6516f85ff" (UID: "7e2bae28-f861-42b0-8fff-59b6516f85ff"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.956446 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7e2bae28-f861-42b0-8fff-59b6516f85ff" (UID: "7e2bae28-f861-42b0-8fff-59b6516f85ff"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.969087 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d3e0b825-0c6a-49ed-bb87-097ab0e686ee-openstack-config-secret\") pod \"openstackclient\" (UID: \"d3e0b825-0c6a-49ed-bb87-097ab0e686ee\") " pod="openstack/openstackclient" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.969139 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlstr\" (UniqueName: \"kubernetes.io/projected/d3e0b825-0c6a-49ed-bb87-097ab0e686ee-kube-api-access-qlstr\") pod \"openstackclient\" (UID: \"d3e0b825-0c6a-49ed-bb87-097ab0e686ee\") " pod="openstack/openstackclient" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.969160 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d3e0b825-0c6a-49ed-bb87-097ab0e686ee-openstack-config\") pod \"openstackclient\" (UID: \"d3e0b825-0c6a-49ed-bb87-097ab0e686ee\") " pod="openstack/openstackclient" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.969243 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e0b825-0c6a-49ed-bb87-097ab0e686ee-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d3e0b825-0c6a-49ed-bb87-097ab0e686ee\") " pod="openstack/openstackclient" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.969322 4891 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.969333 4891 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.969343 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.969351 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.975921 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e0b825-0c6a-49ed-bb87-097ab0e686ee-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d3e0b825-0c6a-49ed-bb87-097ab0e686ee\") " pod="openstack/openstackclient" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.976284 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d3e0b825-0c6a-49ed-bb87-097ab0e686ee-openstack-config\") pod \"openstackclient\" (UID: \"d3e0b825-0c6a-49ed-bb87-097ab0e686ee\") " pod="openstack/openstackclient" Sep 29 10:06:23 crc kubenswrapper[4891]: I0929 10:06:23.976283 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-config" (OuterVolumeSpecName: "config") pod "7e2bae28-f861-42b0-8fff-59b6516f85ff" (UID: "7e2bae28-f861-42b0-8fff-59b6516f85ff"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:06:24 crc kubenswrapper[4891]: I0929 10:06:24.007429 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d3e0b825-0c6a-49ed-bb87-097ab0e686ee-openstack-config-secret\") pod \"openstackclient\" (UID: \"d3e0b825-0c6a-49ed-bb87-097ab0e686ee\") " pod="openstack/openstackclient" Sep 29 10:06:24 crc kubenswrapper[4891]: I0929 10:06:24.017912 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlstr\" (UniqueName: \"kubernetes.io/projected/d3e0b825-0c6a-49ed-bb87-097ab0e686ee-kube-api-access-qlstr\") pod \"openstackclient\" (UID: \"d3e0b825-0c6a-49ed-bb87-097ab0e686ee\") " pod="openstack/openstackclient" Sep 29 10:06:24 crc kubenswrapper[4891]: I0929 10:06:24.073218 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2bae28-f861-42b0-8fff-59b6516f85ff-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:24 crc kubenswrapper[4891]: I0929 10:06:24.248332 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Sep 29 10:06:24 crc kubenswrapper[4891]: I0929 10:06:24.330109 4891 generic.go:334] "Generic (PLEG): container finished" podID="cb3bb277-4352-4266-a5f1-9fbb8ea07eed" containerID="bbfe07f5894f58f5d90c81be72bf85dcfc65fb5c362fb57361bb9d99d21d2915" exitCode=143 Sep 29 10:06:24 crc kubenswrapper[4891]: I0929 10:06:24.330203 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bbdbf46b6-mv6rb" event={"ID":"cb3bb277-4352-4266-a5f1-9fbb8ea07eed","Type":"ContainerDied","Data":"bbfe07f5894f58f5d90c81be72bf85dcfc65fb5c362fb57361bb9d99d21d2915"} Sep 29 10:06:24 crc kubenswrapper[4891]: I0929 10:06:24.347968 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" event={"ID":"7e2bae28-f861-42b0-8fff-59b6516f85ff","Type":"ContainerDied","Data":"f8bcb9f25f9c1e7b13decbef5b11e131ea130fe869b43d81a5de1bcd0a0d01fa"} Sep 29 10:06:24 crc kubenswrapper[4891]: I0929 10:06:24.348025 4891 scope.go:117] "RemoveContainer" containerID="ad342bf4983501bf4838c89a78fd3408436b7bbd7c8e4311706a8ad829a18622" Sep 29 10:06:24 crc kubenswrapper[4891]: I0929 10:06:24.348187 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-qjfvh" Sep 29 10:06:24 crc kubenswrapper[4891]: I0929 10:06:24.398262 4891 generic.go:334] "Generic (PLEG): container finished" podID="adcba488-cf27-4f33-8046-52a1f2c92b9b" containerID="b2f500907188a05e44bfb23e95e794f24e6465960eddc5cf8ab4ec28d770445a" exitCode=0 Sep 29 10:06:24 crc kubenswrapper[4891]: I0929 10:06:24.398474 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f8b757f88-7hxnh" event={"ID":"adcba488-cf27-4f33-8046-52a1f2c92b9b","Type":"ContainerDied","Data":"b2f500907188a05e44bfb23e95e794f24e6465960eddc5cf8ab4ec28d770445a"} Sep 29 10:06:24 crc kubenswrapper[4891]: I0929 10:06:24.477164 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-qjfvh"] Sep 29 10:06:24 crc kubenswrapper[4891]: I0929 10:06:24.500915 4891 scope.go:117] "RemoveContainer" containerID="37ae262c85e8f74c37d5ca8a66a0cec073822408fdf83d62c18a9ebc0e42fe90" Sep 29 10:06:24 crc kubenswrapper[4891]: I0929 10:06:24.529160 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-qjfvh"] Sep 29 10:06:24 crc kubenswrapper[4891]: I0929 10:06:24.581088 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7975d54bd8-pl4st" podUID="4cfbacc9-ec34-4515-9874-1fd082cdbea3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Sep 29 10:06:24 crc kubenswrapper[4891]: W0929 10:06:24.925585 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3e0b825_0c6a_49ed_bb87_097ab0e686ee.slice/crio-4b37b72828806198512249e4ff7d5b0624fe283c7a5563feddff5e0a780f7061 WatchSource:0}: Error finding container 4b37b72828806198512249e4ff7d5b0624fe283c7a5563feddff5e0a780f7061: Status 404 returned error can't find the 
container with id 4b37b72828806198512249e4ff7d5b0624fe283c7a5563feddff5e0a780f7061 Sep 29 10:06:24 crc kubenswrapper[4891]: I0929 10:06:24.927265 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 29 10:06:25 crc kubenswrapper[4891]: I0929 10:06:25.410778 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d3e0b825-0c6a-49ed-bb87-097ab0e686ee","Type":"ContainerStarted","Data":"4b37b72828806198512249e4ff7d5b0624fe283c7a5563feddff5e0a780f7061"} Sep 29 10:06:26 crc kubenswrapper[4891]: I0929 10:06:26.408724 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e2bae28-f861-42b0-8fff-59b6516f85ff" path="/var/lib/kubelet/pods/7e2bae28-f861-42b0-8fff-59b6516f85ff/volumes" Sep 29 10:06:30 crc kubenswrapper[4891]: I0929 10:06:30.694967 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bbdbf46b6-mv6rb" podUID="cb3bb277-4352-4266-a5f1-9fbb8ea07eed" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:38854->10.217.0.161:9311: read: connection reset by peer" Sep 29 10:06:30 crc kubenswrapper[4891]: I0929 10:06:30.694993 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bbdbf46b6-mv6rb" podUID="cb3bb277-4352-4266-a5f1-9fbb8ea07eed" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:38852->10.217.0.161:9311: read: connection reset by peer" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.268006 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.268885 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f76ad70-c0e6-424d-893b-634a7ab43070" containerName="proxy-httpd" 
containerID="cri-o://fad7f33305a324a7601d3f1518d1f8834a41c85b0d4f168c1ed37594fa934615" gracePeriod=30 Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.269057 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f76ad70-c0e6-424d-893b-634a7ab43070" containerName="ceilometer-notification-agent" containerID="cri-o://7b7bedd4dece6c6040fff107a8f521c915de625d5e7778d03dcb10a07ca00474" gracePeriod=30 Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.269058 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f76ad70-c0e6-424d-893b-634a7ab43070" containerName="sg-core" containerID="cri-o://9671bec061cc2461570459d1a75b20f020b9d721b46bf1564485d784f013575b" gracePeriod=30 Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.273029 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f76ad70-c0e6-424d-893b-634a7ab43070" containerName="ceilometer-central-agent" containerID="cri-o://79ddc933223ca1af0fb495bb94e91b022f14f4c2d7d7e7b778ee09f62f412916" gracePeriod=30 Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.276635 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="0f76ad70-c0e6-424d-893b-634a7ab43070" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.164:3000/\": EOF" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.477544 4891 generic.go:334] "Generic (PLEG): container finished" podID="cb3bb277-4352-4266-a5f1-9fbb8ea07eed" containerID="6218225394a322a6f76553e58b455fb8257dccf40460bd9388c6b777b244fd53" exitCode=0 Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.477602 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bbdbf46b6-mv6rb" 
event={"ID":"cb3bb277-4352-4266-a5f1-9fbb8ea07eed","Type":"ContainerDied","Data":"6218225394a322a6f76553e58b455fb8257dccf40460bd9388c6b777b244fd53"} Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.481503 4891 generic.go:334] "Generic (PLEG): container finished" podID="0f76ad70-c0e6-424d-893b-634a7ab43070" containerID="9671bec061cc2461570459d1a75b20f020b9d721b46bf1564485d784f013575b" exitCode=2 Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.481527 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f76ad70-c0e6-424d-893b-634a7ab43070","Type":"ContainerDied","Data":"9671bec061cc2461570459d1a75b20f020b9d721b46bf1564485d784f013575b"} Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.528446 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7df6f5468c-2kcvk"] Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.531749 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.535664 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.535891 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.536233 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.554429 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7df6f5468c-2kcvk"] Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.667144 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfq5t\" (UniqueName: 
\"kubernetes.io/projected/82a9d505-81c4-410a-9707-adb83f47f425-kube-api-access-vfq5t\") pod \"swift-proxy-7df6f5468c-2kcvk\" (UID: \"82a9d505-81c4-410a-9707-adb83f47f425\") " pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.667191 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a9d505-81c4-410a-9707-adb83f47f425-config-data\") pod \"swift-proxy-7df6f5468c-2kcvk\" (UID: \"82a9d505-81c4-410a-9707-adb83f47f425\") " pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.667270 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82a9d505-81c4-410a-9707-adb83f47f425-run-httpd\") pod \"swift-proxy-7df6f5468c-2kcvk\" (UID: \"82a9d505-81c4-410a-9707-adb83f47f425\") " pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.667372 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82a9d505-81c4-410a-9707-adb83f47f425-internal-tls-certs\") pod \"swift-proxy-7df6f5468c-2kcvk\" (UID: \"82a9d505-81c4-410a-9707-adb83f47f425\") " pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.667421 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a9d505-81c4-410a-9707-adb83f47f425-combined-ca-bundle\") pod \"swift-proxy-7df6f5468c-2kcvk\" (UID: \"82a9d505-81c4-410a-9707-adb83f47f425\") " pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.667453 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82a9d505-81c4-410a-9707-adb83f47f425-log-httpd\") pod \"swift-proxy-7df6f5468c-2kcvk\" (UID: \"82a9d505-81c4-410a-9707-adb83f47f425\") " pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.667485 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/82a9d505-81c4-410a-9707-adb83f47f425-public-tls-certs\") pod \"swift-proxy-7df6f5468c-2kcvk\" (UID: \"82a9d505-81c4-410a-9707-adb83f47f425\") " pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.667514 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/82a9d505-81c4-410a-9707-adb83f47f425-etc-swift\") pod \"swift-proxy-7df6f5468c-2kcvk\" (UID: \"82a9d505-81c4-410a-9707-adb83f47f425\") " pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.769667 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82a9d505-81c4-410a-9707-adb83f47f425-run-httpd\") pod \"swift-proxy-7df6f5468c-2kcvk\" (UID: \"82a9d505-81c4-410a-9707-adb83f47f425\") " pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.769730 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82a9d505-81c4-410a-9707-adb83f47f425-internal-tls-certs\") pod \"swift-proxy-7df6f5468c-2kcvk\" (UID: \"82a9d505-81c4-410a-9707-adb83f47f425\") " pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.769784 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/82a9d505-81c4-410a-9707-adb83f47f425-combined-ca-bundle\") pod \"swift-proxy-7df6f5468c-2kcvk\" (UID: \"82a9d505-81c4-410a-9707-adb83f47f425\") " pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.769831 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82a9d505-81c4-410a-9707-adb83f47f425-log-httpd\") pod \"swift-proxy-7df6f5468c-2kcvk\" (UID: \"82a9d505-81c4-410a-9707-adb83f47f425\") " pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.769892 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/82a9d505-81c4-410a-9707-adb83f47f425-public-tls-certs\") pod \"swift-proxy-7df6f5468c-2kcvk\" (UID: \"82a9d505-81c4-410a-9707-adb83f47f425\") " pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.770408 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82a9d505-81c4-410a-9707-adb83f47f425-log-httpd\") pod \"swift-proxy-7df6f5468c-2kcvk\" (UID: \"82a9d505-81c4-410a-9707-adb83f47f425\") " pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.770483 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/82a9d505-81c4-410a-9707-adb83f47f425-etc-swift\") pod \"swift-proxy-7df6f5468c-2kcvk\" (UID: \"82a9d505-81c4-410a-9707-adb83f47f425\") " pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.770525 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfq5t\" (UniqueName: 
\"kubernetes.io/projected/82a9d505-81c4-410a-9707-adb83f47f425-kube-api-access-vfq5t\") pod \"swift-proxy-7df6f5468c-2kcvk\" (UID: \"82a9d505-81c4-410a-9707-adb83f47f425\") " pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.770550 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a9d505-81c4-410a-9707-adb83f47f425-config-data\") pod \"swift-proxy-7df6f5468c-2kcvk\" (UID: \"82a9d505-81c4-410a-9707-adb83f47f425\") " pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.771317 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82a9d505-81c4-410a-9707-adb83f47f425-run-httpd\") pod \"swift-proxy-7df6f5468c-2kcvk\" (UID: \"82a9d505-81c4-410a-9707-adb83f47f425\") " pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.777694 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a9d505-81c4-410a-9707-adb83f47f425-config-data\") pod \"swift-proxy-7df6f5468c-2kcvk\" (UID: \"82a9d505-81c4-410a-9707-adb83f47f425\") " pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.777806 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82a9d505-81c4-410a-9707-adb83f47f425-internal-tls-certs\") pod \"swift-proxy-7df6f5468c-2kcvk\" (UID: \"82a9d505-81c4-410a-9707-adb83f47f425\") " pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.777984 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/82a9d505-81c4-410a-9707-adb83f47f425-etc-swift\") pod 
\"swift-proxy-7df6f5468c-2kcvk\" (UID: \"82a9d505-81c4-410a-9707-adb83f47f425\") " pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.780628 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/82a9d505-81c4-410a-9707-adb83f47f425-public-tls-certs\") pod \"swift-proxy-7df6f5468c-2kcvk\" (UID: \"82a9d505-81c4-410a-9707-adb83f47f425\") " pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.787973 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a9d505-81c4-410a-9707-adb83f47f425-combined-ca-bundle\") pod \"swift-proxy-7df6f5468c-2kcvk\" (UID: \"82a9d505-81c4-410a-9707-adb83f47f425\") " pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.789606 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfq5t\" (UniqueName: \"kubernetes.io/projected/82a9d505-81c4-410a-9707-adb83f47f425-kube-api-access-vfq5t\") pod \"swift-proxy-7df6f5468c-2kcvk\" (UID: \"82a9d505-81c4-410a-9707-adb83f47f425\") " pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:31 crc kubenswrapper[4891]: I0929 10:06:31.854003 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:32 crc kubenswrapper[4891]: I0929 10:06:32.497058 4891 generic.go:334] "Generic (PLEG): container finished" podID="0f76ad70-c0e6-424d-893b-634a7ab43070" containerID="fad7f33305a324a7601d3f1518d1f8834a41c85b0d4f168c1ed37594fa934615" exitCode=0 Sep 29 10:06:32 crc kubenswrapper[4891]: I0929 10:06:32.497104 4891 generic.go:334] "Generic (PLEG): container finished" podID="0f76ad70-c0e6-424d-893b-634a7ab43070" containerID="7b7bedd4dece6c6040fff107a8f521c915de625d5e7778d03dcb10a07ca00474" exitCode=0 Sep 29 10:06:32 crc kubenswrapper[4891]: I0929 10:06:32.497114 4891 generic.go:334] "Generic (PLEG): container finished" podID="0f76ad70-c0e6-424d-893b-634a7ab43070" containerID="79ddc933223ca1af0fb495bb94e91b022f14f4c2d7d7e7b778ee09f62f412916" exitCode=0 Sep 29 10:06:32 crc kubenswrapper[4891]: I0929 10:06:32.497133 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f76ad70-c0e6-424d-893b-634a7ab43070","Type":"ContainerDied","Data":"fad7f33305a324a7601d3f1518d1f8834a41c85b0d4f168c1ed37594fa934615"} Sep 29 10:06:32 crc kubenswrapper[4891]: I0929 10:06:32.497158 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f76ad70-c0e6-424d-893b-634a7ab43070","Type":"ContainerDied","Data":"7b7bedd4dece6c6040fff107a8f521c915de625d5e7778d03dcb10a07ca00474"} Sep 29 10:06:32 crc kubenswrapper[4891]: I0929 10:06:32.497171 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f76ad70-c0e6-424d-893b-634a7ab43070","Type":"ContainerDied","Data":"79ddc933223ca1af0fb495bb94e91b022f14f4c2d7d7e7b778ee09f62f412916"} Sep 29 10:06:33 crc kubenswrapper[4891]: I0929 10:06:33.025488 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bbdbf46b6-mv6rb" podUID="cb3bb277-4352-4266-a5f1-9fbb8ea07eed" containerName="barbican-api" probeResult="failure" 
output="Get \"http://10.217.0.161:9311/healthcheck\": dial tcp 10.217.0.161:9311: connect: connection refused" Sep 29 10:06:33 crc kubenswrapper[4891]: I0929 10:06:33.025899 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bbdbf46b6-mv6rb" podUID="cb3bb277-4352-4266-a5f1-9fbb8ea07eed" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": dial tcp 10.217.0.161:9311: connect: connection refused" Sep 29 10:06:33 crc kubenswrapper[4891]: I0929 10:06:33.553480 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-bbtmc"] Sep 29 10:06:33 crc kubenswrapper[4891]: I0929 10:06:33.555662 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-bbtmc" Sep 29 10:06:33 crc kubenswrapper[4891]: I0929 10:06:33.578696 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-bbtmc"] Sep 29 10:06:33 crc kubenswrapper[4891]: I0929 10:06:33.660649 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-kzk72"] Sep 29 10:06:33 crc kubenswrapper[4891]: I0929 10:06:33.662360 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-kzk72" Sep 29 10:06:33 crc kubenswrapper[4891]: I0929 10:06:33.676267 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-kzk72"] Sep 29 10:06:33 crc kubenswrapper[4891]: I0929 10:06:33.716964 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hr2x\" (UniqueName: \"kubernetes.io/projected/0718b8fe-fc19-46ae-8c7f-bdd9908c9730-kube-api-access-8hr2x\") pod \"nova-api-db-create-bbtmc\" (UID: \"0718b8fe-fc19-46ae-8c7f-bdd9908c9730\") " pod="openstack/nova-api-db-create-bbtmc" Sep 29 10:06:33 crc kubenswrapper[4891]: I0929 10:06:33.768620 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-p9kt4"] Sep 29 10:06:33 crc kubenswrapper[4891]: I0929 10:06:33.770152 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-p9kt4" Sep 29 10:06:33 crc kubenswrapper[4891]: I0929 10:06:33.776678 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-p9kt4"] Sep 29 10:06:33 crc kubenswrapper[4891]: I0929 10:06:33.819775 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99rhc\" (UniqueName: \"kubernetes.io/projected/fc047cc9-f61d-4575-a8bd-d6b69aa77701-kube-api-access-99rhc\") pod \"nova-cell0-db-create-kzk72\" (UID: \"fc047cc9-f61d-4575-a8bd-d6b69aa77701\") " pod="openstack/nova-cell0-db-create-kzk72" Sep 29 10:06:33 crc kubenswrapper[4891]: I0929 10:06:33.819884 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hr2x\" (UniqueName: \"kubernetes.io/projected/0718b8fe-fc19-46ae-8c7f-bdd9908c9730-kube-api-access-8hr2x\") pod \"nova-api-db-create-bbtmc\" (UID: \"0718b8fe-fc19-46ae-8c7f-bdd9908c9730\") " pod="openstack/nova-api-db-create-bbtmc" Sep 29 10:06:33 crc 
kubenswrapper[4891]: I0929 10:06:33.863960 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hr2x\" (UniqueName: \"kubernetes.io/projected/0718b8fe-fc19-46ae-8c7f-bdd9908c9730-kube-api-access-8hr2x\") pod \"nova-api-db-create-bbtmc\" (UID: \"0718b8fe-fc19-46ae-8c7f-bdd9908c9730\") " pod="openstack/nova-api-db-create-bbtmc" Sep 29 10:06:33 crc kubenswrapper[4891]: I0929 10:06:33.893542 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-bbtmc" Sep 29 10:06:33 crc kubenswrapper[4891]: I0929 10:06:33.922412 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6qfn\" (UniqueName: \"kubernetes.io/projected/11722413-5550-4d6a-b328-f57f26166791-kube-api-access-d6qfn\") pod \"nova-cell1-db-create-p9kt4\" (UID: \"11722413-5550-4d6a-b328-f57f26166791\") " pod="openstack/nova-cell1-db-create-p9kt4" Sep 29 10:06:33 crc kubenswrapper[4891]: I0929 10:06:33.922544 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99rhc\" (UniqueName: \"kubernetes.io/projected/fc047cc9-f61d-4575-a8bd-d6b69aa77701-kube-api-access-99rhc\") pod \"nova-cell0-db-create-kzk72\" (UID: \"fc047cc9-f61d-4575-a8bd-d6b69aa77701\") " pod="openstack/nova-cell0-db-create-kzk72" Sep 29 10:06:33 crc kubenswrapper[4891]: I0929 10:06:33.952253 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99rhc\" (UniqueName: \"kubernetes.io/projected/fc047cc9-f61d-4575-a8bd-d6b69aa77701-kube-api-access-99rhc\") pod \"nova-cell0-db-create-kzk72\" (UID: \"fc047cc9-f61d-4575-a8bd-d6b69aa77701\") " pod="openstack/nova-cell0-db-create-kzk72" Sep 29 10:06:33 crc kubenswrapper[4891]: I0929 10:06:33.986558 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-kzk72" Sep 29 10:06:34 crc kubenswrapper[4891]: I0929 10:06:34.024193 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6qfn\" (UniqueName: \"kubernetes.io/projected/11722413-5550-4d6a-b328-f57f26166791-kube-api-access-d6qfn\") pod \"nova-cell1-db-create-p9kt4\" (UID: \"11722413-5550-4d6a-b328-f57f26166791\") " pod="openstack/nova-cell1-db-create-p9kt4" Sep 29 10:06:34 crc kubenswrapper[4891]: I0929 10:06:34.045328 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6qfn\" (UniqueName: \"kubernetes.io/projected/11722413-5550-4d6a-b328-f57f26166791-kube-api-access-d6qfn\") pod \"nova-cell1-db-create-p9kt4\" (UID: \"11722413-5550-4d6a-b328-f57f26166791\") " pod="openstack/nova-cell1-db-create-p9kt4" Sep 29 10:06:34 crc kubenswrapper[4891]: I0929 10:06:34.100543 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-p9kt4" Sep 29 10:06:34 crc kubenswrapper[4891]: I0929 10:06:34.582815 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7975d54bd8-pl4st" podUID="4cfbacc9-ec34-4515-9874-1fd082cdbea3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Sep 29 10:06:35 crc kubenswrapper[4891]: I0929 10:06:35.549307 4891 generic.go:334] "Generic (PLEG): container finished" podID="adcba488-cf27-4f33-8046-52a1f2c92b9b" containerID="b670472af43fb7a0fe22de0471be6eaf4c651568002cb2dac21913a7fed90a6f" exitCode=0 Sep 29 10:06:35 crc kubenswrapper[4891]: I0929 10:06:35.549411 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f8b757f88-7hxnh" event={"ID":"adcba488-cf27-4f33-8046-52a1f2c92b9b","Type":"ContainerDied","Data":"b670472af43fb7a0fe22de0471be6eaf4c651568002cb2dac21913a7fed90a6f"} Sep 
29 10:06:35 crc kubenswrapper[4891]: I0929 10:06:35.552617 4891 generic.go:334] "Generic (PLEG): container finished" podID="f7dd6438-e338-4dce-b2be-0e36b359631c" containerID="c28443f9e7a2b71f9692eaaa2c85a35d227f85bfaefc93933ca38d04d8f8d31a" exitCode=0 Sep 29 10:06:35 crc kubenswrapper[4891]: I0929 10:06:35.552690 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-hgm8s" event={"ID":"f7dd6438-e338-4dce-b2be-0e36b359631c","Type":"ContainerDied","Data":"c28443f9e7a2b71f9692eaaa2c85a35d227f85bfaefc93933ca38d04d8f8d31a"} Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.412803 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.575617 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d3e0b825-0c6a-49ed-bb87-097ab0e686ee","Type":"ContainerStarted","Data":"8fb1ef0deebdefb4d0d8bc2f8ab24c4bda4d9fa0951276564724c82d12b1cc0a"} Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.586881 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f76ad70-c0e6-424d-893b-634a7ab43070-sg-core-conf-yaml\") pod \"0f76ad70-c0e6-424d-893b-634a7ab43070\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.587022 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f76ad70-c0e6-424d-893b-634a7ab43070-combined-ca-bundle\") pod \"0f76ad70-c0e6-424d-893b-634a7ab43070\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.587053 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f76ad70-c0e6-424d-893b-634a7ab43070-run-httpd\") 
pod \"0f76ad70-c0e6-424d-893b-634a7ab43070\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.587152 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f76ad70-c0e6-424d-893b-634a7ab43070-log-httpd\") pod \"0f76ad70-c0e6-424d-893b-634a7ab43070\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.587201 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f76ad70-c0e6-424d-893b-634a7ab43070-config-data\") pod \"0f76ad70-c0e6-424d-893b-634a7ab43070\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.587253 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfnxb\" (UniqueName: \"kubernetes.io/projected/0f76ad70-c0e6-424d-893b-634a7ab43070-kube-api-access-cfnxb\") pod \"0f76ad70-c0e6-424d-893b-634a7ab43070\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.587362 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f76ad70-c0e6-424d-893b-634a7ab43070-scripts\") pod \"0f76ad70-c0e6-424d-893b-634a7ab43070\" (UID: \"0f76ad70-c0e6-424d-893b-634a7ab43070\") " Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.588060 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f76ad70-c0e6-424d-893b-634a7ab43070-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0f76ad70-c0e6-424d-893b-634a7ab43070" (UID: "0f76ad70-c0e6-424d-893b-634a7ab43070"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.588496 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f76ad70-c0e6-424d-893b-634a7ab43070-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0f76ad70-c0e6-424d-893b-634a7ab43070" (UID: "0f76ad70-c0e6-424d-893b-634a7ab43070"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.593068 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.594059 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f76ad70-c0e6-424d-893b-634a7ab43070","Type":"ContainerDied","Data":"b0347f5cfbafb1c74131bffc2cd8de89387457012e269df301012120ed4543fc"} Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.594119 4891 scope.go:117] "RemoveContainer" containerID="fad7f33305a324a7601d3f1518d1f8834a41c85b0d4f168c1ed37594fa934615" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.618300 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f76ad70-c0e6-424d-893b-634a7ab43070-kube-api-access-cfnxb" (OuterVolumeSpecName: "kube-api-access-cfnxb") pod "0f76ad70-c0e6-424d-893b-634a7ab43070" (UID: "0f76ad70-c0e6-424d-893b-634a7ab43070"). InnerVolumeSpecName "kube-api-access-cfnxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.623277 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f76ad70-c0e6-424d-893b-634a7ab43070-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0f76ad70-c0e6-424d-893b-634a7ab43070" (UID: "0f76ad70-c0e6-424d-893b-634a7ab43070"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.624127 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f76ad70-c0e6-424d-893b-634a7ab43070-scripts" (OuterVolumeSpecName: "scripts") pod "0f76ad70-c0e6-424d-893b-634a7ab43070" (UID: "0f76ad70-c0e6-424d-893b-634a7ab43070"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.629177 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.441097759 podStartE2EDuration="13.629154085s" podCreationTimestamp="2025-09-29 10:06:23 +0000 UTC" firstStartedPulling="2025-09-29 10:06:24.93009074 +0000 UTC m=+1115.135259061" lastFinishedPulling="2025-09-29 10:06:36.118147046 +0000 UTC m=+1126.323315387" observedRunningTime="2025-09-29 10:06:36.610842974 +0000 UTC m=+1126.816011345" watchObservedRunningTime="2025-09-29 10:06:36.629154085 +0000 UTC m=+1126.834322406" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.637044 4891 scope.go:117] "RemoveContainer" containerID="9671bec061cc2461570459d1a75b20f020b9d721b46bf1564485d784f013575b" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.659378 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f8b757f88-7hxnh" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.675619 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5bbdbf46b6-mv6rb" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.683051 4891 scope.go:117] "RemoveContainer" containerID="7b7bedd4dece6c6040fff107a8f521c915de625d5e7778d03dcb10a07ca00474" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.693030 4891 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f76ad70-c0e6-424d-893b-634a7ab43070-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.693069 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfnxb\" (UniqueName: \"kubernetes.io/projected/0f76ad70-c0e6-424d-893b-634a7ab43070-kube-api-access-cfnxb\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.693082 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f76ad70-c0e6-424d-893b-634a7ab43070-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.693093 4891 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f76ad70-c0e6-424d-893b-634a7ab43070-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.693106 4891 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f76ad70-c0e6-424d-893b-634a7ab43070-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.744397 4891 scope.go:117] "RemoveContainer" containerID="79ddc933223ca1af0fb495bb94e91b022f14f4c2d7d7e7b778ee09f62f412916" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.745232 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f76ad70-c0e6-424d-893b-634a7ab43070-combined-ca-bundle" (OuterVolumeSpecName: 
"combined-ca-bundle") pod "0f76ad70-c0e6-424d-893b-634a7ab43070" (UID: "0f76ad70-c0e6-424d-893b-634a7ab43070"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.754037 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f76ad70-c0e6-424d-893b-634a7ab43070-config-data" (OuterVolumeSpecName: "config-data") pod "0f76ad70-c0e6-424d-893b-634a7ab43070" (UID: "0f76ad70-c0e6-424d-893b-634a7ab43070"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.794204 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/adcba488-cf27-4f33-8046-52a1f2c92b9b-ovndb-tls-certs\") pod \"adcba488-cf27-4f33-8046-52a1f2c92b9b\" (UID: \"adcba488-cf27-4f33-8046-52a1f2c92b9b\") " Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.794289 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-logs\") pod \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\" (UID: \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\") " Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.794306 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-config-data-custom\") pod \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\" (UID: \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\") " Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.794355 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/adcba488-cf27-4f33-8046-52a1f2c92b9b-httpd-config\") pod \"adcba488-cf27-4f33-8046-52a1f2c92b9b\" (UID: 
\"adcba488-cf27-4f33-8046-52a1f2c92b9b\") " Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.794449 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gt9t\" (UniqueName: \"kubernetes.io/projected/adcba488-cf27-4f33-8046-52a1f2c92b9b-kube-api-access-2gt9t\") pod \"adcba488-cf27-4f33-8046-52a1f2c92b9b\" (UID: \"adcba488-cf27-4f33-8046-52a1f2c92b9b\") " Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.794563 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adcba488-cf27-4f33-8046-52a1f2c92b9b-combined-ca-bundle\") pod \"adcba488-cf27-4f33-8046-52a1f2c92b9b\" (UID: \"adcba488-cf27-4f33-8046-52a1f2c92b9b\") " Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.794590 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-combined-ca-bundle\") pod \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\" (UID: \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\") " Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.794611 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfzk6\" (UniqueName: \"kubernetes.io/projected/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-kube-api-access-vfzk6\") pod \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\" (UID: \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\") " Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.794678 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-config-data\") pod \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\" (UID: \"cb3bb277-4352-4266-a5f1-9fbb8ea07eed\") " Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.794702 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/adcba488-cf27-4f33-8046-52a1f2c92b9b-config\") pod \"adcba488-cf27-4f33-8046-52a1f2c92b9b\" (UID: \"adcba488-cf27-4f33-8046-52a1f2c92b9b\") " Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.794876 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-logs" (OuterVolumeSpecName: "logs") pod "cb3bb277-4352-4266-a5f1-9fbb8ea07eed" (UID: "cb3bb277-4352-4266-a5f1-9fbb8ea07eed"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.795151 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f76ad70-c0e6-424d-893b-634a7ab43070-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.795165 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f76ad70-c0e6-424d-893b-634a7ab43070-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.795174 4891 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.799730 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adcba488-cf27-4f33-8046-52a1f2c92b9b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "adcba488-cf27-4f33-8046-52a1f2c92b9b" (UID: "adcba488-cf27-4f33-8046-52a1f2c92b9b"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.800254 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adcba488-cf27-4f33-8046-52a1f2c92b9b-kube-api-access-2gt9t" (OuterVolumeSpecName: "kube-api-access-2gt9t") pod "adcba488-cf27-4f33-8046-52a1f2c92b9b" (UID: "adcba488-cf27-4f33-8046-52a1f2c92b9b"). InnerVolumeSpecName "kube-api-access-2gt9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.801249 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cb3bb277-4352-4266-a5f1-9fbb8ea07eed" (UID: "cb3bb277-4352-4266-a5f1-9fbb8ea07eed"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.802242 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-kube-api-access-vfzk6" (OuterVolumeSpecName: "kube-api-access-vfzk6") pod "cb3bb277-4352-4266-a5f1-9fbb8ea07eed" (UID: "cb3bb277-4352-4266-a5f1-9fbb8ea07eed"). InnerVolumeSpecName "kube-api-access-vfzk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.823403 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb3bb277-4352-4266-a5f1-9fbb8ea07eed" (UID: "cb3bb277-4352-4266-a5f1-9fbb8ea07eed"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.866915 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-config-data" (OuterVolumeSpecName: "config-data") pod "cb3bb277-4352-4266-a5f1-9fbb8ea07eed" (UID: "cb3bb277-4352-4266-a5f1-9fbb8ea07eed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.876607 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adcba488-cf27-4f33-8046-52a1f2c92b9b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "adcba488-cf27-4f33-8046-52a1f2c92b9b" (UID: "adcba488-cf27-4f33-8046-52a1f2c92b9b"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.889898 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adcba488-cf27-4f33-8046-52a1f2c92b9b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "adcba488-cf27-4f33-8046-52a1f2c92b9b" (UID: "adcba488-cf27-4f33-8046-52a1f2c92b9b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.900133 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adcba488-cf27-4f33-8046-52a1f2c92b9b-config" (OuterVolumeSpecName: "config") pod "adcba488-cf27-4f33-8046-52a1f2c92b9b" (UID: "adcba488-cf27-4f33-8046-52a1f2c92b9b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.900529 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/adcba488-cf27-4f33-8046-52a1f2c92b9b-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.900550 4891 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/adcba488-cf27-4f33-8046-52a1f2c92b9b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.900563 4891 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.900573 4891 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/adcba488-cf27-4f33-8046-52a1f2c92b9b-httpd-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.900583 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gt9t\" (UniqueName: \"kubernetes.io/projected/adcba488-cf27-4f33-8046-52a1f2c92b9b-kube-api-access-2gt9t\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.900593 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adcba488-cf27-4f33-8046-52a1f2c92b9b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.900603 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.900612 4891 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfzk6\" (UniqueName: \"kubernetes.io/projected/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-kube-api-access-vfzk6\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:36 crc kubenswrapper[4891]: I0929 10:06:36.900620 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb3bb277-4352-4266-a5f1-9fbb8ea07eed-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.046917 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-p9kt4"] Sep 29 10:06:37 crc kubenswrapper[4891]: W0929 10:06:37.063170 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11722413_5550_4d6a_b328_f57f26166791.slice/crio-c7496cf27b2566a34d5136e5f0fdd7c0e9a832e12e1014a3074a913402fce1f7 WatchSource:0}: Error finding container c7496cf27b2566a34d5136e5f0fdd7c0e9a832e12e1014a3074a913402fce1f7: Status 404 returned error can't find the container with id c7496cf27b2566a34d5136e5f0fdd7c0e9a832e12e1014a3074a913402fce1f7 Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.066058 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-bbtmc"] Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.073123 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-kzk72"] Sep 29 10:06:37 crc kubenswrapper[4891]: W0929 10:06:37.080947 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0718b8fe_fc19_46ae_8c7f_bdd9908c9730.slice/crio-73faa6e86b9958a2b09cc79c2f8927c993fd288b1c78445171de93f95d448784 WatchSource:0}: Error finding container 73faa6e86b9958a2b09cc79c2f8927c993fd288b1c78445171de93f95d448784: Status 404 returned error can't find the container with id 
73faa6e86b9958a2b09cc79c2f8927c993fd288b1c78445171de93f95d448784 Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.148910 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-hgm8s" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.162060 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.176968 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.187358 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7df6f5468c-2kcvk"] Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.224358 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:06:37 crc kubenswrapper[4891]: E0929 10:06:37.225044 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7dd6438-e338-4dce-b2be-0e36b359631c" containerName="cinder-db-sync" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.225063 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7dd6438-e338-4dce-b2be-0e36b359631c" containerName="cinder-db-sync" Sep 29 10:06:37 crc kubenswrapper[4891]: E0929 10:06:37.225079 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb3bb277-4352-4266-a5f1-9fbb8ea07eed" containerName="barbican-api-log" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.225090 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb3bb277-4352-4266-a5f1-9fbb8ea07eed" containerName="barbican-api-log" Sep 29 10:06:37 crc kubenswrapper[4891]: E0929 10:06:37.225105 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb3bb277-4352-4266-a5f1-9fbb8ea07eed" containerName="barbican-api" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.225112 4891 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cb3bb277-4352-4266-a5f1-9fbb8ea07eed" containerName="barbican-api" Sep 29 10:06:37 crc kubenswrapper[4891]: E0929 10:06:37.225140 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f76ad70-c0e6-424d-893b-634a7ab43070" containerName="proxy-httpd" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.225148 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f76ad70-c0e6-424d-893b-634a7ab43070" containerName="proxy-httpd" Sep 29 10:06:37 crc kubenswrapper[4891]: E0929 10:06:37.225163 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adcba488-cf27-4f33-8046-52a1f2c92b9b" containerName="neutron-httpd" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.225171 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="adcba488-cf27-4f33-8046-52a1f2c92b9b" containerName="neutron-httpd" Sep 29 10:06:37 crc kubenswrapper[4891]: E0929 10:06:37.225189 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adcba488-cf27-4f33-8046-52a1f2c92b9b" containerName="neutron-api" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.225197 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="adcba488-cf27-4f33-8046-52a1f2c92b9b" containerName="neutron-api" Sep 29 10:06:37 crc kubenswrapper[4891]: E0929 10:06:37.225205 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f76ad70-c0e6-424d-893b-634a7ab43070" containerName="ceilometer-notification-agent" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.225212 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f76ad70-c0e6-424d-893b-634a7ab43070" containerName="ceilometer-notification-agent" Sep 29 10:06:37 crc kubenswrapper[4891]: E0929 10:06:37.225234 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f76ad70-c0e6-424d-893b-634a7ab43070" containerName="sg-core" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.225242 4891 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0f76ad70-c0e6-424d-893b-634a7ab43070" containerName="sg-core" Sep 29 10:06:37 crc kubenswrapper[4891]: E0929 10:06:37.225262 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f76ad70-c0e6-424d-893b-634a7ab43070" containerName="ceilometer-central-agent" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.225311 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f76ad70-c0e6-424d-893b-634a7ab43070" containerName="ceilometer-central-agent" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.227950 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f76ad70-c0e6-424d-893b-634a7ab43070" containerName="sg-core" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.227975 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb3bb277-4352-4266-a5f1-9fbb8ea07eed" containerName="barbican-api-log" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.227987 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f76ad70-c0e6-424d-893b-634a7ab43070" containerName="proxy-httpd" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.228000 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f76ad70-c0e6-424d-893b-634a7ab43070" containerName="ceilometer-notification-agent" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.228011 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f76ad70-c0e6-424d-893b-634a7ab43070" containerName="ceilometer-central-agent" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.228030 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="adcba488-cf27-4f33-8046-52a1f2c92b9b" containerName="neutron-api" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.228040 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb3bb277-4352-4266-a5f1-9fbb8ea07eed" containerName="barbican-api" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.228053 4891 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f7dd6438-e338-4dce-b2be-0e36b359631c" containerName="cinder-db-sync" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.228064 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="adcba488-cf27-4f33-8046-52a1f2c92b9b" containerName="neutron-httpd" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.233922 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.234032 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.236681 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.237017 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.312852 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f7dd6438-e338-4dce-b2be-0e36b359631c-etc-machine-id\") pod \"f7dd6438-e338-4dce-b2be-0e36b359631c\" (UID: \"f7dd6438-e338-4dce-b2be-0e36b359631c\") " Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.312950 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7dd6438-e338-4dce-b2be-0e36b359631c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f7dd6438-e338-4dce-b2be-0e36b359631c" (UID: "f7dd6438-e338-4dce-b2be-0e36b359631c"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.313044 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7dd6438-e338-4dce-b2be-0e36b359631c-combined-ca-bundle\") pod \"f7dd6438-e338-4dce-b2be-0e36b359631c\" (UID: \"f7dd6438-e338-4dce-b2be-0e36b359631c\") " Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.313070 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f7dd6438-e338-4dce-b2be-0e36b359631c-db-sync-config-data\") pod \"f7dd6438-e338-4dce-b2be-0e36b359631c\" (UID: \"f7dd6438-e338-4dce-b2be-0e36b359631c\") " Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.313171 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7dd6438-e338-4dce-b2be-0e36b359631c-config-data\") pod \"f7dd6438-e338-4dce-b2be-0e36b359631c\" (UID: \"f7dd6438-e338-4dce-b2be-0e36b359631c\") " Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.313201 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-456p7\" (UniqueName: \"kubernetes.io/projected/f7dd6438-e338-4dce-b2be-0e36b359631c-kube-api-access-456p7\") pod \"f7dd6438-e338-4dce-b2be-0e36b359631c\" (UID: \"f7dd6438-e338-4dce-b2be-0e36b359631c\") " Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.313227 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7dd6438-e338-4dce-b2be-0e36b359631c-scripts\") pod \"f7dd6438-e338-4dce-b2be-0e36b359631c\" (UID: \"f7dd6438-e338-4dce-b2be-0e36b359631c\") " Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.313663 4891 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/f7dd6438-e338-4dce-b2be-0e36b359631c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.324317 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7dd6438-e338-4dce-b2be-0e36b359631c-kube-api-access-456p7" (OuterVolumeSpecName: "kube-api-access-456p7") pod "f7dd6438-e338-4dce-b2be-0e36b359631c" (UID: "f7dd6438-e338-4dce-b2be-0e36b359631c"). InnerVolumeSpecName "kube-api-access-456p7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.325535 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7dd6438-e338-4dce-b2be-0e36b359631c-scripts" (OuterVolumeSpecName: "scripts") pod "f7dd6438-e338-4dce-b2be-0e36b359631c" (UID: "f7dd6438-e338-4dce-b2be-0e36b359631c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.330697 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7dd6438-e338-4dce-b2be-0e36b359631c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f7dd6438-e338-4dce-b2be-0e36b359631c" (UID: "f7dd6438-e338-4dce-b2be-0e36b359631c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.355992 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7dd6438-e338-4dce-b2be-0e36b359631c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7dd6438-e338-4dce-b2be-0e36b359631c" (UID: "f7dd6438-e338-4dce-b2be-0e36b359631c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.390441 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7dd6438-e338-4dce-b2be-0e36b359631c-config-data" (OuterVolumeSpecName: "config-data") pod "f7dd6438-e338-4dce-b2be-0e36b359631c" (UID: "f7dd6438-e338-4dce-b2be-0e36b359631c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.415277 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " pod="openstack/ceilometer-0" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.415357 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-log-httpd\") pod \"ceilometer-0\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " pod="openstack/ceilometer-0" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.415403 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-config-data\") pod \"ceilometer-0\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " pod="openstack/ceilometer-0" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.415430 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " pod="openstack/ceilometer-0" Sep 29 10:06:37 crc 
kubenswrapper[4891]: I0929 10:06:37.415458 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-run-httpd\") pod \"ceilometer-0\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " pod="openstack/ceilometer-0" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.415491 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-scripts\") pod \"ceilometer-0\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " pod="openstack/ceilometer-0" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.415518 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-942zl\" (UniqueName: \"kubernetes.io/projected/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-kube-api-access-942zl\") pod \"ceilometer-0\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " pod="openstack/ceilometer-0" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.415588 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7dd6438-e338-4dce-b2be-0e36b359631c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.415601 4891 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f7dd6438-e338-4dce-b2be-0e36b359631c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.415610 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7dd6438-e338-4dce-b2be-0e36b359631c-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.415618 4891 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-456p7\" (UniqueName: \"kubernetes.io/projected/f7dd6438-e338-4dce-b2be-0e36b359631c-kube-api-access-456p7\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.415629 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7dd6438-e338-4dce-b2be-0e36b359631c-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.517026 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " pod="openstack/ceilometer-0" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.517134 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-log-httpd\") pod \"ceilometer-0\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " pod="openstack/ceilometer-0" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.517184 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-config-data\") pod \"ceilometer-0\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " pod="openstack/ceilometer-0" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.517207 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " pod="openstack/ceilometer-0" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.517251 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-run-httpd\") pod \"ceilometer-0\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " pod="openstack/ceilometer-0" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.517296 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-scripts\") pod \"ceilometer-0\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " pod="openstack/ceilometer-0" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.517321 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-942zl\" (UniqueName: \"kubernetes.io/projected/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-kube-api-access-942zl\") pod \"ceilometer-0\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " pod="openstack/ceilometer-0" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.518470 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-run-httpd\") pod \"ceilometer-0\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " pod="openstack/ceilometer-0" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.520227 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-log-httpd\") pod \"ceilometer-0\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " pod="openstack/ceilometer-0" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.523243 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " pod="openstack/ceilometer-0" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 
10:06:37.524076 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " pod="openstack/ceilometer-0" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.524345 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-scripts\") pod \"ceilometer-0\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " pod="openstack/ceilometer-0" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.524658 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-config-data\") pod \"ceilometer-0\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " pod="openstack/ceilometer-0" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.536760 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-942zl\" (UniqueName: \"kubernetes.io/projected/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-kube-api-access-942zl\") pod \"ceilometer-0\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " pod="openstack/ceilometer-0" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.587077 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.613610 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f8b757f88-7hxnh" event={"ID":"adcba488-cf27-4f33-8046-52a1f2c92b9b","Type":"ContainerDied","Data":"679740f3aae0740c12f0a2a8a0cd94b4e432284320733ba79ec2375d25607f35"} Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.613676 4891 scope.go:117] "RemoveContainer" containerID="b2f500907188a05e44bfb23e95e794f24e6465960eddc5cf8ab4ec28d770445a" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.613881 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f8b757f88-7hxnh" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.617585 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7df6f5468c-2kcvk" event={"ID":"82a9d505-81c4-410a-9707-adb83f47f425","Type":"ContainerStarted","Data":"e8905237210610728224cfa9026ba90a6825c3d1600800ca2e74d00b020c2300"} Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.617618 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7df6f5468c-2kcvk" event={"ID":"82a9d505-81c4-410a-9707-adb83f47f425","Type":"ContainerStarted","Data":"e5233f095c54f4e6d491ce236a569d5588bd0a1f77c96610e6760da5210a7f0a"} Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.629724 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bbdbf46b6-mv6rb" event={"ID":"cb3bb277-4352-4266-a5f1-9fbb8ea07eed","Type":"ContainerDied","Data":"ee4e1d6e8fbe16e6ab102b888f35c18efa8528d5916b3ccbffd6a334f5cdd9cd"} Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.629931 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5bbdbf46b6-mv6rb" Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.635694 4891 generic.go:334] "Generic (PLEG): container finished" podID="0718b8fe-fc19-46ae-8c7f-bdd9908c9730" containerID="3cf5c0f51126bb73d321dd15082b5342d17bee079ca62c8cc243358139c5e88d" exitCode=0 Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.635864 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-bbtmc" event={"ID":"0718b8fe-fc19-46ae-8c7f-bdd9908c9730","Type":"ContainerDied","Data":"3cf5c0f51126bb73d321dd15082b5342d17bee079ca62c8cc243358139c5e88d"} Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.636071 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-bbtmc" event={"ID":"0718b8fe-fc19-46ae-8c7f-bdd9908c9730","Type":"ContainerStarted","Data":"73faa6e86b9958a2b09cc79c2f8927c993fd288b1c78445171de93f95d448784"} Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.655125 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-hgm8s" event={"ID":"f7dd6438-e338-4dce-b2be-0e36b359631c","Type":"ContainerDied","Data":"9ad034a361cf152236a509595cced45e1bc679d65914c13e5eebb429556a1e62"} Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.655152 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-hgm8s"
Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.655192 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ad034a361cf152236a509595cced45e1bc679d65914c13e5eebb429556a1e62"
Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.689996 4891 scope.go:117] "RemoveContainer" containerID="b670472af43fb7a0fe22de0471be6eaf4c651568002cb2dac21913a7fed90a6f"
Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.696384 4891 generic.go:334] "Generic (PLEG): container finished" podID="fc047cc9-f61d-4575-a8bd-d6b69aa77701" containerID="fc2abc4dd9c52bfb8d12a419e65f4e47289e232baab2282213e00b71966e8fe1" exitCode=0
Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.696466 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kzk72" event={"ID":"fc047cc9-f61d-4575-a8bd-d6b69aa77701","Type":"ContainerDied","Data":"fc2abc4dd9c52bfb8d12a419e65f4e47289e232baab2282213e00b71966e8fe1"}
Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.696493 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kzk72" event={"ID":"fc047cc9-f61d-4575-a8bd-d6b69aa77701","Type":"ContainerStarted","Data":"65444e306e066b577dd2375e1ca8b048fada75525cd0fdf52ddef85450055137"}
Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.713520 4891 generic.go:334] "Generic (PLEG): container finished" podID="11722413-5550-4d6a-b328-f57f26166791" containerID="31e52820b7dec217715b8c9fb4934b0d20c89e9419a82a26be2c31565a0a1edb" exitCode=0
Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.714463 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-p9kt4" event={"ID":"11722413-5550-4d6a-b328-f57f26166791","Type":"ContainerDied","Data":"31e52820b7dec217715b8c9fb4934b0d20c89e9419a82a26be2c31565a0a1edb"}
Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.715679 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-p9kt4" event={"ID":"11722413-5550-4d6a-b328-f57f26166791","Type":"ContainerStarted","Data":"c7496cf27b2566a34d5136e5f0fdd7c0e9a832e12e1014a3074a913402fce1f7"}
Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.744014 4891 scope.go:117] "RemoveContainer" containerID="6218225394a322a6f76553e58b455fb8257dccf40460bd9388c6b777b244fd53"
Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.744158 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7f8b757f88-7hxnh"]
Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.769480 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7f8b757f88-7hxnh"]
Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.829144 4891 scope.go:117] "RemoveContainer" containerID="bbfe07f5894f58f5d90c81be72bf85dcfc65fb5c362fb57361bb9d99d21d2915"
Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.873928 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5bbdbf46b6-mv6rb"]
Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.895656 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5bbdbf46b6-mv6rb"]
Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.957852 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.961074 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.967062 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.967893 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.967981 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.968089 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-jz6qc"
Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.968166 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.983203 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-jjh8f"]
Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.986999 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f"
Sep 29 10:06:37 crc kubenswrapper[4891]: I0929 10:06:37.998647 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-jjh8f"]
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.064414 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.066552 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.071196 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.076140 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.144427 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srnzv\" (UniqueName: \"kubernetes.io/projected/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-kube-api-access-srnzv\") pod \"dnsmasq-dns-6bb4fc677f-jjh8f\" (UID: \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.144567 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/803e2d14-8309-4637-9f99-d0903e7ac08e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"803e2d14-8309-4637-9f99-d0903e7ac08e\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.144720 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-jjh8f\" (UID: \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.144804 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/803e2d14-8309-4637-9f99-d0903e7ac08e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"803e2d14-8309-4637-9f99-d0903e7ac08e\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.144927 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-jjh8f\" (UID: \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.145064 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/803e2d14-8309-4637-9f99-d0903e7ac08e-config-data\") pod \"cinder-scheduler-0\" (UID: \"803e2d14-8309-4637-9f99-d0903e7ac08e\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.145159 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-jjh8f\" (UID: \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.145276 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/803e2d14-8309-4637-9f99-d0903e7ac08e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"803e2d14-8309-4637-9f99-d0903e7ac08e\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.145368 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-config\") pod \"dnsmasq-dns-6bb4fc677f-jjh8f\" (UID: \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.145437 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-jjh8f\" (UID: \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.145514 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tfdr\" (UniqueName: \"kubernetes.io/projected/803e2d14-8309-4637-9f99-d0903e7ac08e-kube-api-access-4tfdr\") pod \"cinder-scheduler-0\" (UID: \"803e2d14-8309-4637-9f99-d0903e7ac08e\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.145585 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/803e2d14-8309-4637-9f99-d0903e7ac08e-scripts\") pod \"cinder-scheduler-0\" (UID: \"803e2d14-8309-4637-9f99-d0903e7ac08e\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.247122 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-jjh8f\" (UID: \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.247230 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/971c6661-3f72-479c-b95e-ab997fab876a-scripts\") pod \"cinder-api-0\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " pod="openstack/cinder-api-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.247318 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/803e2d14-8309-4637-9f99-d0903e7ac08e-config-data\") pod \"cinder-scheduler-0\" (UID: \"803e2d14-8309-4637-9f99-d0903e7ac08e\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.248145 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-jjh8f\" (UID: \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.248294 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-jjh8f\" (UID: \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.248339 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/803e2d14-8309-4637-9f99-d0903e7ac08e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"803e2d14-8309-4637-9f99-d0903e7ac08e\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.248377 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-config\") pod \"dnsmasq-dns-6bb4fc677f-jjh8f\" (UID: \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.248395 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-jjh8f\" (UID: \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.248423 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tfdr\" (UniqueName: \"kubernetes.io/projected/803e2d14-8309-4637-9f99-d0903e7ac08e-kube-api-access-4tfdr\") pod \"cinder-scheduler-0\" (UID: \"803e2d14-8309-4637-9f99-d0903e7ac08e\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.248465 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncc7x\" (UniqueName: \"kubernetes.io/projected/971c6661-3f72-479c-b95e-ab997fab876a-kube-api-access-ncc7x\") pod \"cinder-api-0\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " pod="openstack/cinder-api-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.248490 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/803e2d14-8309-4637-9f99-d0903e7ac08e-scripts\") pod \"cinder-scheduler-0\" (UID: \"803e2d14-8309-4637-9f99-d0903e7ac08e\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.248529 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srnzv\" (UniqueName: \"kubernetes.io/projected/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-kube-api-access-srnzv\") pod \"dnsmasq-dns-6bb4fc677f-jjh8f\" (UID: \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.248547 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/803e2d14-8309-4637-9f99-d0903e7ac08e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"803e2d14-8309-4637-9f99-d0903e7ac08e\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.248576 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971c6661-3f72-479c-b95e-ab997fab876a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " pod="openstack/cinder-api-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.248607 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/971c6661-3f72-479c-b95e-ab997fab876a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " pod="openstack/cinder-api-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.248628 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/971c6661-3f72-479c-b95e-ab997fab876a-logs\") pod \"cinder-api-0\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " pod="openstack/cinder-api-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.248694 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971c6661-3f72-479c-b95e-ab997fab876a-config-data\") pod \"cinder-api-0\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " pod="openstack/cinder-api-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.248721 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-jjh8f\" (UID: \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.248741 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/803e2d14-8309-4637-9f99-d0903e7ac08e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"803e2d14-8309-4637-9f99-d0903e7ac08e\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.248763 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/971c6661-3f72-479c-b95e-ab997fab876a-config-data-custom\") pod \"cinder-api-0\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " pod="openstack/cinder-api-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.250054 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-jjh8f\" (UID: \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.250104 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/803e2d14-8309-4637-9f99-d0903e7ac08e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"803e2d14-8309-4637-9f99-d0903e7ac08e\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.250606 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-config\") pod \"dnsmasq-dns-6bb4fc677f-jjh8f\" (UID: \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.251210 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-jjh8f\" (UID: \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.256741 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-jjh8f\" (UID: \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.258960 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/803e2d14-8309-4637-9f99-d0903e7ac08e-config-data\") pod \"cinder-scheduler-0\" (UID: \"803e2d14-8309-4637-9f99-d0903e7ac08e\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.268609 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/803e2d14-8309-4637-9f99-d0903e7ac08e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"803e2d14-8309-4637-9f99-d0903e7ac08e\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.272704 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/803e2d14-8309-4637-9f99-d0903e7ac08e-scripts\") pod \"cinder-scheduler-0\" (UID: \"803e2d14-8309-4637-9f99-d0903e7ac08e\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.278397 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srnzv\" (UniqueName: \"kubernetes.io/projected/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-kube-api-access-srnzv\") pod \"dnsmasq-dns-6bb4fc677f-jjh8f\" (UID: \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.280283 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tfdr\" (UniqueName: \"kubernetes.io/projected/803e2d14-8309-4637-9f99-d0903e7ac08e-kube-api-access-4tfdr\") pod \"cinder-scheduler-0\" (UID: \"803e2d14-8309-4637-9f99-d0903e7ac08e\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.293612 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/803e2d14-8309-4637-9f99-d0903e7ac08e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"803e2d14-8309-4637-9f99-d0903e7ac08e\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.313401 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.352901 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncc7x\" (UniqueName: \"kubernetes.io/projected/971c6661-3f72-479c-b95e-ab997fab876a-kube-api-access-ncc7x\") pod \"cinder-api-0\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " pod="openstack/cinder-api-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.352983 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971c6661-3f72-479c-b95e-ab997fab876a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " pod="openstack/cinder-api-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.353017 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/971c6661-3f72-479c-b95e-ab997fab876a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " pod="openstack/cinder-api-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.353034 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/971c6661-3f72-479c-b95e-ab997fab876a-logs\") pod \"cinder-api-0\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " pod="openstack/cinder-api-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.353077 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971c6661-3f72-479c-b95e-ab997fab876a-config-data\") pod \"cinder-api-0\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " pod="openstack/cinder-api-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.353100 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/971c6661-3f72-479c-b95e-ab997fab876a-config-data-custom\") pod \"cinder-api-0\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " pod="openstack/cinder-api-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.353147 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/971c6661-3f72-479c-b95e-ab997fab876a-scripts\") pod \"cinder-api-0\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " pod="openstack/cinder-api-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.353327 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/971c6661-3f72-479c-b95e-ab997fab876a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " pod="openstack/cinder-api-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.354251 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/971c6661-3f72-479c-b95e-ab997fab876a-logs\") pod \"cinder-api-0\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " pod="openstack/cinder-api-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.360315 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/971c6661-3f72-479c-b95e-ab997fab876a-scripts\") pod \"cinder-api-0\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " pod="openstack/cinder-api-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.366332 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/971c6661-3f72-479c-b95e-ab997fab876a-config-data-custom\") pod \"cinder-api-0\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " pod="openstack/cinder-api-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.366991 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971c6661-3f72-479c-b95e-ab997fab876a-config-data\") pod \"cinder-api-0\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " pod="openstack/cinder-api-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.370313 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971c6661-3f72-479c-b95e-ab997fab876a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " pod="openstack/cinder-api-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.386747 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.387417 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncc7x\" (UniqueName: \"kubernetes.io/projected/971c6661-3f72-479c-b95e-ab997fab876a-kube-api-access-ncc7x\") pod \"cinder-api-0\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " pod="openstack/cinder-api-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.407328 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.409262 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f76ad70-c0e6-424d-893b-634a7ab43070" path="/var/lib/kubelet/pods/0f76ad70-c0e6-424d-893b-634a7ab43070/volumes"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.410225 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adcba488-cf27-4f33-8046-52a1f2c92b9b" path="/var/lib/kubelet/pods/adcba488-cf27-4f33-8046-52a1f2c92b9b/volumes"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.410849 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb3bb277-4352-4266-a5f1-9fbb8ea07eed" path="/var/lib/kubelet/pods/cb3bb277-4352-4266-a5f1-9fbb8ea07eed/volumes"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.589528 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.746334 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb3d540e-66d9-4d98-8cf2-a75d809d76f8","Type":"ContainerStarted","Data":"fca0246e3c86aebbc4f681384df8b448adb73e73120c34fb54475641990303f1"}
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.763776 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7df6f5468c-2kcvk" event={"ID":"82a9d505-81c4-410a-9707-adb83f47f425","Type":"ContainerStarted","Data":"3a15ea3c060b47e243eb4aca91dac41c541f8bdb43cc1f02fa2b0531832d74a9"}
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.764032 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7df6f5468c-2kcvk"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.764059 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7df6f5468c-2kcvk"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.805468 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7df6f5468c-2kcvk" podStartSLOduration=7.805447304 podStartE2EDuration="7.805447304s" podCreationTimestamp="2025-09-29 10:06:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:06:38.794436769 +0000 UTC m=+1128.999605090" watchObservedRunningTime="2025-09-29 10:06:38.805447304 +0000 UTC m=+1129.010615625"
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.813056 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-jjh8f"]
Sep 29 10:06:38 crc kubenswrapper[4891]: I0929 10:06:38.972441 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.271247 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-bbtmc"
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.290834 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-p9kt4"
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.386961 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hr2x\" (UniqueName: \"kubernetes.io/projected/0718b8fe-fc19-46ae-8c7f-bdd9908c9730-kube-api-access-8hr2x\") pod \"0718b8fe-fc19-46ae-8c7f-bdd9908c9730\" (UID: \"0718b8fe-fc19-46ae-8c7f-bdd9908c9730\") "
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.391808 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0718b8fe-fc19-46ae-8c7f-bdd9908c9730-kube-api-access-8hr2x" (OuterVolumeSpecName: "kube-api-access-8hr2x") pod "0718b8fe-fc19-46ae-8c7f-bdd9908c9730" (UID: "0718b8fe-fc19-46ae-8c7f-bdd9908c9730"). InnerVolumeSpecName "kube-api-access-8hr2x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.462611 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-kzk72"
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.490721 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qfn\" (UniqueName: \"kubernetes.io/projected/11722413-5550-4d6a-b328-f57f26166791-kube-api-access-d6qfn\") pod \"11722413-5550-4d6a-b328-f57f26166791\" (UID: \"11722413-5550-4d6a-b328-f57f26166791\") "
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.492910 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hr2x\" (UniqueName: \"kubernetes.io/projected/0718b8fe-fc19-46ae-8c7f-bdd9908c9730-kube-api-access-8hr2x\") on node \"crc\" DevicePath \"\""
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.496847 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11722413-5550-4d6a-b328-f57f26166791-kube-api-access-d6qfn" (OuterVolumeSpecName: "kube-api-access-d6qfn") pod "11722413-5550-4d6a-b328-f57f26166791" (UID: "11722413-5550-4d6a-b328-f57f26166791"). InnerVolumeSpecName "kube-api-access-d6qfn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.581439 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7975d54bd8-pl4st"
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.595965 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cfbacc9-ec34-4515-9874-1fd082cdbea3-horizon-tls-certs\") pod \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") "
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.596026 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4cfbacc9-ec34-4515-9874-1fd082cdbea3-scripts\") pod \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") "
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.596071 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cfbacc9-ec34-4515-9874-1fd082cdbea3-config-data\") pod \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") "
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.596198 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22cz4\" (UniqueName: \"kubernetes.io/projected/4cfbacc9-ec34-4515-9874-1fd082cdbea3-kube-api-access-22cz4\") pod \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") "
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.601541 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4cfbacc9-ec34-4515-9874-1fd082cdbea3-horizon-secret-key\") pod \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") "
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.601627 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99rhc\" (UniqueName: \"kubernetes.io/projected/fc047cc9-f61d-4575-a8bd-d6b69aa77701-kube-api-access-99rhc\") pod \"fc047cc9-f61d-4575-a8bd-d6b69aa77701\" (UID: \"fc047cc9-f61d-4575-a8bd-d6b69aa77701\") "
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.601692 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cfbacc9-ec34-4515-9874-1fd082cdbea3-combined-ca-bundle\") pod \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") "
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.601748 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cfbacc9-ec34-4515-9874-1fd082cdbea3-logs\") pod \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\" (UID: \"4cfbacc9-ec34-4515-9874-1fd082cdbea3\") "
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.604007 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qfn\" (UniqueName: \"kubernetes.io/projected/11722413-5550-4d6a-b328-f57f26166791-kube-api-access-d6qfn\") on node \"crc\" DevicePath \"\""
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.604395 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cfbacc9-ec34-4515-9874-1fd082cdbea3-logs" (OuterVolumeSpecName: "logs") pod "4cfbacc9-ec34-4515-9874-1fd082cdbea3" (UID: "4cfbacc9-ec34-4515-9874-1fd082cdbea3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.621224 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cfbacc9-ec34-4515-9874-1fd082cdbea3-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4cfbacc9-ec34-4515-9874-1fd082cdbea3" (UID: "4cfbacc9-ec34-4515-9874-1fd082cdbea3"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.621348 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc047cc9-f61d-4575-a8bd-d6b69aa77701-kube-api-access-99rhc" (OuterVolumeSpecName: "kube-api-access-99rhc") pod "fc047cc9-f61d-4575-a8bd-d6b69aa77701" (UID: "fc047cc9-f61d-4575-a8bd-d6b69aa77701"). InnerVolumeSpecName "kube-api-access-99rhc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.634808 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfbacc9-ec34-4515-9874-1fd082cdbea3-scripts" (OuterVolumeSpecName: "scripts") pod "4cfbacc9-ec34-4515-9874-1fd082cdbea3" (UID: "4cfbacc9-ec34-4515-9874-1fd082cdbea3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.642535 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cfbacc9-ec34-4515-9874-1fd082cdbea3-kube-api-access-22cz4" (OuterVolumeSpecName: "kube-api-access-22cz4") pod "4cfbacc9-ec34-4515-9874-1fd082cdbea3" (UID: "4cfbacc9-ec34-4515-9874-1fd082cdbea3"). InnerVolumeSpecName "kube-api-access-22cz4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.647668 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Sep 29 10:06:39 crc kubenswrapper[4891]: W0929 10:06:39.664763 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod803e2d14_8309_4637_9f99_d0903e7ac08e.slice/crio-c0e0e572db8cdea63ab3506a60984d7f7c4d88252a61bf9fc2674c84c4171f0e WatchSource:0}: Error finding container c0e0e572db8cdea63ab3506a60984d7f7c4d88252a61bf9fc2674c84c4171f0e: Status 404 returned error can't find the container with id c0e0e572db8cdea63ab3506a60984d7f7c4d88252a61bf9fc2674c84c4171f0e
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.676807 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cfbacc9-ec34-4515-9874-1fd082cdbea3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cfbacc9-ec34-4515-9874-1fd082cdbea3" (UID: "4cfbacc9-ec34-4515-9874-1fd082cdbea3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.693756 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cfbacc9-ec34-4515-9874-1fd082cdbea3-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "4cfbacc9-ec34-4515-9874-1fd082cdbea3" (UID: "4cfbacc9-ec34-4515-9874-1fd082cdbea3"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.700551 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfbacc9-ec34-4515-9874-1fd082cdbea3-config-data" (OuterVolumeSpecName: "config-data") pod "4cfbacc9-ec34-4515-9874-1fd082cdbea3" (UID: "4cfbacc9-ec34-4515-9874-1fd082cdbea3").
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.706659 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99rhc\" (UniqueName: \"kubernetes.io/projected/fc047cc9-f61d-4575-a8bd-d6b69aa77701-kube-api-access-99rhc\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.706699 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cfbacc9-ec34-4515-9874-1fd082cdbea3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.706710 4891 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cfbacc9-ec34-4515-9874-1fd082cdbea3-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.706723 4891 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cfbacc9-ec34-4515-9874-1fd082cdbea3-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.706732 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4cfbacc9-ec34-4515-9874-1fd082cdbea3-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.706742 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cfbacc9-ec34-4515-9874-1fd082cdbea3-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.706751 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22cz4\" (UniqueName: \"kubernetes.io/projected/4cfbacc9-ec34-4515-9874-1fd082cdbea3-kube-api-access-22cz4\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:39 crc 
kubenswrapper[4891]: I0929 10:06:39.706760 4891 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4cfbacc9-ec34-4515-9874-1fd082cdbea3-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.798441 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-bbtmc" event={"ID":"0718b8fe-fc19-46ae-8c7f-bdd9908c9730","Type":"ContainerDied","Data":"73faa6e86b9958a2b09cc79c2f8927c993fd288b1c78445171de93f95d448784"} Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.798486 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73faa6e86b9958a2b09cc79c2f8927c993fd288b1c78445171de93f95d448784" Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.798583 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-bbtmc" Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.814594 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb3d540e-66d9-4d98-8cf2-a75d809d76f8","Type":"ContainerStarted","Data":"f3385b43b3813904574c23b8b51608d73f349a3e0e6433edd0efac72b4ffdeab"} Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.822515 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"971c6661-3f72-479c-b95e-ab997fab876a","Type":"ContainerStarted","Data":"00a0c00e672be44db7f9c253db8f2f4593c14891ae402173cf865c3855759b0d"} Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.847088 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"803e2d14-8309-4637-9f99-d0903e7ac08e","Type":"ContainerStarted","Data":"c0e0e572db8cdea63ab3506a60984d7f7c4d88252a61bf9fc2674c84c4171f0e"} Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.875072 4891 generic.go:334] "Generic (PLEG): container finished" 
podID="27917a33-23af-42a4-a85c-3bf8f1f9c1d0" containerID="6aa9d893d89c701ad32acee5980613f9e2a92e0643ffead4f8d1fa6089941c14" exitCode=0 Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.875141 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f" event={"ID":"27917a33-23af-42a4-a85c-3bf8f1f9c1d0","Type":"ContainerDied","Data":"6aa9d893d89c701ad32acee5980613f9e2a92e0643ffead4f8d1fa6089941c14"} Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.875167 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f" event={"ID":"27917a33-23af-42a4-a85c-3bf8f1f9c1d0","Type":"ContainerStarted","Data":"31c1766316c7f6b5da97b4049dd08372da38af4bde548ac50ee6e581a9a5cfe0"} Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.907412 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kzk72" event={"ID":"fc047cc9-f61d-4575-a8bd-d6b69aa77701","Type":"ContainerDied","Data":"65444e306e066b577dd2375e1ca8b048fada75525cd0fdf52ddef85450055137"} Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.907453 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65444e306e066b577dd2375e1ca8b048fada75525cd0fdf52ddef85450055137" Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.907542 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-kzk72" Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.952088 4891 generic.go:334] "Generic (PLEG): container finished" podID="4cfbacc9-ec34-4515-9874-1fd082cdbea3" containerID="a39e6f0a15611e3cd02b49db7e4c7f404f1b3c67178b2785efd8953d9a079bd6" exitCode=137 Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.952202 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7975d54bd8-pl4st" event={"ID":"4cfbacc9-ec34-4515-9874-1fd082cdbea3","Type":"ContainerDied","Data":"a39e6f0a15611e3cd02b49db7e4c7f404f1b3c67178b2785efd8953d9a079bd6"} Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.952237 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7975d54bd8-pl4st" event={"ID":"4cfbacc9-ec34-4515-9874-1fd082cdbea3","Type":"ContainerDied","Data":"33f370cf6b4c923624a71e57f568bd1cdea62ff07271dd7ea4b4997204c9e0d8"} Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.952258 4891 scope.go:117] "RemoveContainer" containerID="40b5245eac00287a95c8b00f8c6d769adb11cdf2a57f7302ef53817064744865" Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.952420 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7975d54bd8-pl4st" Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.993683 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-p9kt4" Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.994822 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-p9kt4" event={"ID":"11722413-5550-4d6a-b328-f57f26166791","Type":"ContainerDied","Data":"c7496cf27b2566a34d5136e5f0fdd7c0e9a832e12e1014a3074a913402fce1f7"} Sep 29 10:06:39 crc kubenswrapper[4891]: I0929 10:06:39.994883 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7496cf27b2566a34d5136e5f0fdd7c0e9a832e12e1014a3074a913402fce1f7" Sep 29 10:06:40 crc kubenswrapper[4891]: I0929 10:06:40.149849 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7975d54bd8-pl4st"] Sep 29 10:06:40 crc kubenswrapper[4891]: I0929 10:06:40.166740 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7975d54bd8-pl4st"] Sep 29 10:06:40 crc kubenswrapper[4891]: I0929 10:06:40.343998 4891 scope.go:117] "RemoveContainer" containerID="a39e6f0a15611e3cd02b49db7e4c7f404f1b3c67178b2785efd8953d9a079bd6" Sep 29 10:06:40 crc kubenswrapper[4891]: I0929 10:06:40.452612 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cfbacc9-ec34-4515-9874-1fd082cdbea3" path="/var/lib/kubelet/pods/4cfbacc9-ec34-4515-9874-1fd082cdbea3/volumes" Sep 29 10:06:40 crc kubenswrapper[4891]: I0929 10:06:40.458239 4891 scope.go:117] "RemoveContainer" containerID="40b5245eac00287a95c8b00f8c6d769adb11cdf2a57f7302ef53817064744865" Sep 29 10:06:40 crc kubenswrapper[4891]: E0929 10:06:40.465998 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40b5245eac00287a95c8b00f8c6d769adb11cdf2a57f7302ef53817064744865\": container with ID starting with 40b5245eac00287a95c8b00f8c6d769adb11cdf2a57f7302ef53817064744865 not found: ID does not exist" containerID="40b5245eac00287a95c8b00f8c6d769adb11cdf2a57f7302ef53817064744865" Sep 29 
10:06:40 crc kubenswrapper[4891]: I0929 10:06:40.466040 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40b5245eac00287a95c8b00f8c6d769adb11cdf2a57f7302ef53817064744865"} err="failed to get container status \"40b5245eac00287a95c8b00f8c6d769adb11cdf2a57f7302ef53817064744865\": rpc error: code = NotFound desc = could not find container \"40b5245eac00287a95c8b00f8c6d769adb11cdf2a57f7302ef53817064744865\": container with ID starting with 40b5245eac00287a95c8b00f8c6d769adb11cdf2a57f7302ef53817064744865 not found: ID does not exist" Sep 29 10:06:40 crc kubenswrapper[4891]: I0929 10:06:40.466060 4891 scope.go:117] "RemoveContainer" containerID="a39e6f0a15611e3cd02b49db7e4c7f404f1b3c67178b2785efd8953d9a079bd6" Sep 29 10:06:40 crc kubenswrapper[4891]: E0929 10:06:40.474939 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a39e6f0a15611e3cd02b49db7e4c7f404f1b3c67178b2785efd8953d9a079bd6\": container with ID starting with a39e6f0a15611e3cd02b49db7e4c7f404f1b3c67178b2785efd8953d9a079bd6 not found: ID does not exist" containerID="a39e6f0a15611e3cd02b49db7e4c7f404f1b3c67178b2785efd8953d9a079bd6" Sep 29 10:06:40 crc kubenswrapper[4891]: I0929 10:06:40.474989 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a39e6f0a15611e3cd02b49db7e4c7f404f1b3c67178b2785efd8953d9a079bd6"} err="failed to get container status \"a39e6f0a15611e3cd02b49db7e4c7f404f1b3c67178b2785efd8953d9a079bd6\": rpc error: code = NotFound desc = could not find container \"a39e6f0a15611e3cd02b49db7e4c7f404f1b3c67178b2785efd8953d9a079bd6\": container with ID starting with a39e6f0a15611e3cd02b49db7e4c7f404f1b3c67178b2785efd8953d9a079bd6 not found: ID does not exist" Sep 29 10:06:40 crc kubenswrapper[4891]: I0929 10:06:40.573194 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 29 10:06:41 crc 
kubenswrapper[4891]: I0929 10:06:41.043540 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"971c6661-3f72-479c-b95e-ab997fab876a","Type":"ContainerStarted","Data":"9eb7c8af4131187f0f62f635262ba90f12793e20be821ceda658477f2088832c"} Sep 29 10:06:41 crc kubenswrapper[4891]: I0929 10:06:41.047352 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f" event={"ID":"27917a33-23af-42a4-a85c-3bf8f1f9c1d0","Type":"ContainerStarted","Data":"4299840af8eb34673b5ae9dec18e4b1d4b83b14a3079ad95a1d68ef73db2207b"} Sep 29 10:06:41 crc kubenswrapper[4891]: I0929 10:06:41.047512 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f" Sep 29 10:06:41 crc kubenswrapper[4891]: I0929 10:06:41.054936 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb3d540e-66d9-4d98-8cf2-a75d809d76f8","Type":"ContainerStarted","Data":"a44bb4819fb7f27879bd81effbeb2297eb677d033e974498d7d81fa2f71b50b5"} Sep 29 10:06:41 crc kubenswrapper[4891]: I0929 10:06:41.073466 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f" podStartSLOduration=4.073450064 podStartE2EDuration="4.073450064s" podCreationTimestamp="2025-09-29 10:06:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:06:41.066455277 +0000 UTC m=+1131.271623598" watchObservedRunningTime="2025-09-29 10:06:41.073450064 +0000 UTC m=+1131.278618385" Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.067476 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"803e2d14-8309-4637-9f99-d0903e7ac08e","Type":"ContainerStarted","Data":"63e2a990c20ee68d053fc2c1885fb168abc6d7c01bd5a9963201d6591925f826"} Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 
10:06:42.071164 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"971c6661-3f72-479c-b95e-ab997fab876a","Type":"ContainerStarted","Data":"10ddabd915f8ed5f83934de27df64d2ee3c4cdf132e0779919d2eb5e7e9ff1f4"} Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.071314 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="971c6661-3f72-479c-b95e-ab997fab876a" containerName="cinder-api-log" containerID="cri-o://9eb7c8af4131187f0f62f635262ba90f12793e20be821ceda658477f2088832c" gracePeriod=30 Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.071410 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="971c6661-3f72-479c-b95e-ab997fab876a" containerName="cinder-api" containerID="cri-o://10ddabd915f8ed5f83934de27df64d2ee3c4cdf132e0779919d2eb5e7e9ff1f4" gracePeriod=30 Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.071430 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.077998 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb3d540e-66d9-4d98-8cf2-a75d809d76f8","Type":"ContainerStarted","Data":"4c12211efc349d32904acd71267efa623c5128327b28628c64a8ab625f187f18"} Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.106431 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.106417497 podStartE2EDuration="4.106417497s" podCreationTimestamp="2025-09-29 10:06:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:06:42.105305514 +0000 UTC m=+1132.310473835" watchObservedRunningTime="2025-09-29 10:06:42.106417497 +0000 UTC m=+1132.311585818" Sep 29 10:06:42 crc kubenswrapper[4891]: 
I0929 10:06:42.753306 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.791031 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971c6661-3f72-479c-b95e-ab997fab876a-combined-ca-bundle\") pod \"971c6661-3f72-479c-b95e-ab997fab876a\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.791145 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971c6661-3f72-479c-b95e-ab997fab876a-config-data\") pod \"971c6661-3f72-479c-b95e-ab997fab876a\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.791234 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/971c6661-3f72-479c-b95e-ab997fab876a-config-data-custom\") pod \"971c6661-3f72-479c-b95e-ab997fab876a\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.791263 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncc7x\" (UniqueName: \"kubernetes.io/projected/971c6661-3f72-479c-b95e-ab997fab876a-kube-api-access-ncc7x\") pod \"971c6661-3f72-479c-b95e-ab997fab876a\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.791347 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/971c6661-3f72-479c-b95e-ab997fab876a-scripts\") pod \"971c6661-3f72-479c-b95e-ab997fab876a\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.791419 4891 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/971c6661-3f72-479c-b95e-ab997fab876a-logs\") pod \"971c6661-3f72-479c-b95e-ab997fab876a\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.791523 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/971c6661-3f72-479c-b95e-ab997fab876a-etc-machine-id\") pod \"971c6661-3f72-479c-b95e-ab997fab876a\" (UID: \"971c6661-3f72-479c-b95e-ab997fab876a\") " Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.792032 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/971c6661-3f72-479c-b95e-ab997fab876a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "971c6661-3f72-479c-b95e-ab997fab876a" (UID: "971c6661-3f72-479c-b95e-ab997fab876a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.798470 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/971c6661-3f72-479c-b95e-ab997fab876a-logs" (OuterVolumeSpecName: "logs") pod "971c6661-3f72-479c-b95e-ab997fab876a" (UID: "971c6661-3f72-479c-b95e-ab997fab876a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.798894 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/971c6661-3f72-479c-b95e-ab997fab876a-scripts" (OuterVolumeSpecName: "scripts") pod "971c6661-3f72-479c-b95e-ab997fab876a" (UID: "971c6661-3f72-479c-b95e-ab997fab876a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.799932 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/971c6661-3f72-479c-b95e-ab997fab876a-kube-api-access-ncc7x" (OuterVolumeSpecName: "kube-api-access-ncc7x") pod "971c6661-3f72-479c-b95e-ab997fab876a" (UID: "971c6661-3f72-479c-b95e-ab997fab876a"). InnerVolumeSpecName "kube-api-access-ncc7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.806462 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/971c6661-3f72-479c-b95e-ab997fab876a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "971c6661-3f72-479c-b95e-ab997fab876a" (UID: "971c6661-3f72-479c-b95e-ab997fab876a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.819880 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/971c6661-3f72-479c-b95e-ab997fab876a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "971c6661-3f72-479c-b95e-ab997fab876a" (UID: "971c6661-3f72-479c-b95e-ab997fab876a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.863669 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/971c6661-3f72-479c-b95e-ab997fab876a-config-data" (OuterVolumeSpecName: "config-data") pod "971c6661-3f72-479c-b95e-ab997fab876a" (UID: "971c6661-3f72-479c-b95e-ab997fab876a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.894196 4891 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/971c6661-3f72-479c-b95e-ab997fab876a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.894303 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971c6661-3f72-479c-b95e-ab997fab876a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.894315 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971c6661-3f72-479c-b95e-ab997fab876a-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.894324 4891 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/971c6661-3f72-479c-b95e-ab997fab876a-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.894335 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncc7x\" (UniqueName: \"kubernetes.io/projected/971c6661-3f72-479c-b95e-ab997fab876a-kube-api-access-ncc7x\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.894344 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/971c6661-3f72-479c-b95e-ab997fab876a-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:42 crc kubenswrapper[4891]: I0929 10:06:42.894353 4891 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/971c6661-3f72-479c-b95e-ab997fab876a-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.089065 4891 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"803e2d14-8309-4637-9f99-d0903e7ac08e","Type":"ContainerStarted","Data":"bec3be5d6daa93371faef7bcfba0a7302c998e9bf2c0f4eccea028d31863d8dd"} Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.103023 4891 generic.go:334] "Generic (PLEG): container finished" podID="971c6661-3f72-479c-b95e-ab997fab876a" containerID="10ddabd915f8ed5f83934de27df64d2ee3c4cdf132e0779919d2eb5e7e9ff1f4" exitCode=0 Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.103053 4891 generic.go:334] "Generic (PLEG): container finished" podID="971c6661-3f72-479c-b95e-ab997fab876a" containerID="9eb7c8af4131187f0f62f635262ba90f12793e20be821ceda658477f2088832c" exitCode=143 Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.103078 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"971c6661-3f72-479c-b95e-ab997fab876a","Type":"ContainerDied","Data":"10ddabd915f8ed5f83934de27df64d2ee3c4cdf132e0779919d2eb5e7e9ff1f4"} Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.103106 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"971c6661-3f72-479c-b95e-ab997fab876a","Type":"ContainerDied","Data":"9eb7c8af4131187f0f62f635262ba90f12793e20be821ceda658477f2088832c"} Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.103117 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"971c6661-3f72-479c-b95e-ab997fab876a","Type":"ContainerDied","Data":"00a0c00e672be44db7f9c253db8f2f4593c14891ae402173cf865c3855759b0d"} Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.103132 4891 scope.go:117] "RemoveContainer" containerID="10ddabd915f8ed5f83934de27df64d2ee3c4cdf132e0779919d2eb5e7e9ff1f4" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.103267 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.121247 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.042807435 podStartE2EDuration="6.121218901s" podCreationTimestamp="2025-09-29 10:06:37 +0000 UTC" firstStartedPulling="2025-09-29 10:06:39.674266523 +0000 UTC m=+1129.879434844" lastFinishedPulling="2025-09-29 10:06:40.752677989 +0000 UTC m=+1130.957846310" observedRunningTime="2025-09-29 10:06:43.114115201 +0000 UTC m=+1133.319283542" watchObservedRunningTime="2025-09-29 10:06:43.121218901 +0000 UTC m=+1133.326387222" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.199506 4891 scope.go:117] "RemoveContainer" containerID="9eb7c8af4131187f0f62f635262ba90f12793e20be821ceda658477f2088832c" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.207695 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.223973 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.236228 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 29 10:06:43 crc kubenswrapper[4891]: E0929 10:06:43.236875 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11722413-5550-4d6a-b328-f57f26166791" containerName="mariadb-database-create" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.236893 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="11722413-5550-4d6a-b328-f57f26166791" containerName="mariadb-database-create" Sep 29 10:06:43 crc kubenswrapper[4891]: E0929 10:06:43.236915 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc047cc9-f61d-4575-a8bd-d6b69aa77701" containerName="mariadb-database-create" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.236922 4891 
state_mem.go:107] "Deleted CPUSet assignment" podUID="fc047cc9-f61d-4575-a8bd-d6b69aa77701" containerName="mariadb-database-create" Sep 29 10:06:43 crc kubenswrapper[4891]: E0929 10:06:43.236932 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cfbacc9-ec34-4515-9874-1fd082cdbea3" containerName="horizon" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.236938 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cfbacc9-ec34-4515-9874-1fd082cdbea3" containerName="horizon" Sep 29 10:06:43 crc kubenswrapper[4891]: E0929 10:06:43.236946 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cfbacc9-ec34-4515-9874-1fd082cdbea3" containerName="horizon-log" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.236952 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cfbacc9-ec34-4515-9874-1fd082cdbea3" containerName="horizon-log" Sep 29 10:06:43 crc kubenswrapper[4891]: E0929 10:06:43.236968 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0718b8fe-fc19-46ae-8c7f-bdd9908c9730" containerName="mariadb-database-create" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.236974 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="0718b8fe-fc19-46ae-8c7f-bdd9908c9730" containerName="mariadb-database-create" Sep 29 10:06:43 crc kubenswrapper[4891]: E0929 10:06:43.236984 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="971c6661-3f72-479c-b95e-ab997fab876a" containerName="cinder-api-log" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.236989 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="971c6661-3f72-479c-b95e-ab997fab876a" containerName="cinder-api-log" Sep 29 10:06:43 crc kubenswrapper[4891]: E0929 10:06:43.236998 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="971c6661-3f72-479c-b95e-ab997fab876a" containerName="cinder-api" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.237004 4891 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="971c6661-3f72-479c-b95e-ab997fab876a" containerName="cinder-api" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.237182 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cfbacc9-ec34-4515-9874-1fd082cdbea3" containerName="horizon-log" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.237192 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="971c6661-3f72-479c-b95e-ab997fab876a" containerName="cinder-api" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.237202 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="0718b8fe-fc19-46ae-8c7f-bdd9908c9730" containerName="mariadb-database-create" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.237216 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="11722413-5550-4d6a-b328-f57f26166791" containerName="mariadb-database-create" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.237237 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc047cc9-f61d-4575-a8bd-d6b69aa77701" containerName="mariadb-database-create" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.237246 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="971c6661-3f72-479c-b95e-ab997fab876a" containerName="cinder-api-log" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.237257 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cfbacc9-ec34-4515-9874-1fd082cdbea3" containerName="horizon" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.239884 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.244607 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.244912 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.245030 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.245968 4891 scope.go:117] "RemoveContainer" containerID="10ddabd915f8ed5f83934de27df64d2ee3c4cdf132e0779919d2eb5e7e9ff1f4" Sep 29 10:06:43 crc kubenswrapper[4891]: E0929 10:06:43.266688 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10ddabd915f8ed5f83934de27df64d2ee3c4cdf132e0779919d2eb5e7e9ff1f4\": container with ID starting with 10ddabd915f8ed5f83934de27df64d2ee3c4cdf132e0779919d2eb5e7e9ff1f4 not found: ID does not exist" containerID="10ddabd915f8ed5f83934de27df64d2ee3c4cdf132e0779919d2eb5e7e9ff1f4" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.266762 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10ddabd915f8ed5f83934de27df64d2ee3c4cdf132e0779919d2eb5e7e9ff1f4"} err="failed to get container status \"10ddabd915f8ed5f83934de27df64d2ee3c4cdf132e0779919d2eb5e7e9ff1f4\": rpc error: code = NotFound desc = could not find container \"10ddabd915f8ed5f83934de27df64d2ee3c4cdf132e0779919d2eb5e7e9ff1f4\": container with ID starting with 10ddabd915f8ed5f83934de27df64d2ee3c4cdf132e0779919d2eb5e7e9ff1f4 not found: ID does not exist" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.266855 4891 scope.go:117] "RemoveContainer" containerID="9eb7c8af4131187f0f62f635262ba90f12793e20be821ceda658477f2088832c" Sep 29 10:06:43 crc 
kubenswrapper[4891]: E0929 10:06:43.269637 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9eb7c8af4131187f0f62f635262ba90f12793e20be821ceda658477f2088832c\": container with ID starting with 9eb7c8af4131187f0f62f635262ba90f12793e20be821ceda658477f2088832c not found: ID does not exist" containerID="9eb7c8af4131187f0f62f635262ba90f12793e20be821ceda658477f2088832c" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.269691 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eb7c8af4131187f0f62f635262ba90f12793e20be821ceda658477f2088832c"} err="failed to get container status \"9eb7c8af4131187f0f62f635262ba90f12793e20be821ceda658477f2088832c\": rpc error: code = NotFound desc = could not find container \"9eb7c8af4131187f0f62f635262ba90f12793e20be821ceda658477f2088832c\": container with ID starting with 9eb7c8af4131187f0f62f635262ba90f12793e20be821ceda658477f2088832c not found: ID does not exist" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.269740 4891 scope.go:117] "RemoveContainer" containerID="10ddabd915f8ed5f83934de27df64d2ee3c4cdf132e0779919d2eb5e7e9ff1f4" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.272132 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10ddabd915f8ed5f83934de27df64d2ee3c4cdf132e0779919d2eb5e7e9ff1f4"} err="failed to get container status \"10ddabd915f8ed5f83934de27df64d2ee3c4cdf132e0779919d2eb5e7e9ff1f4\": rpc error: code = NotFound desc = could not find container \"10ddabd915f8ed5f83934de27df64d2ee3c4cdf132e0779919d2eb5e7e9ff1f4\": container with ID starting with 10ddabd915f8ed5f83934de27df64d2ee3c4cdf132e0779919d2eb5e7e9ff1f4 not found: ID does not exist" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.272179 4891 scope.go:117] "RemoveContainer" containerID="9eb7c8af4131187f0f62f635262ba90f12793e20be821ceda658477f2088832c" Sep 29 
10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.272624 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eb7c8af4131187f0f62f635262ba90f12793e20be821ceda658477f2088832c"} err="failed to get container status \"9eb7c8af4131187f0f62f635262ba90f12793e20be821ceda658477f2088832c\": rpc error: code = NotFound desc = could not find container \"9eb7c8af4131187f0f62f635262ba90f12793e20be821ceda658477f2088832c\": container with ID starting with 9eb7c8af4131187f0f62f635262ba90f12793e20be821ceda658477f2088832c not found: ID does not exist" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.280015 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.290861 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.291143 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="45c770aa-803b-4a67-9893-5476845e722b" containerName="glance-log" containerID="cri-o://d0669b907cc3d7e8c87cadfaf246355966f47174c884b019614d59bfb2f25fde" gracePeriod=30 Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.291628 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="45c770aa-803b-4a67-9893-5476845e722b" containerName="glance-httpd" containerID="cri-o://9103a76e6bcf89965269b9e15298120cb4b4733599df4c283fe1de2ab1b5b6f9" gracePeriod=30 Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.302640 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77aa5ca3-797d-4f00-8f2d-d735b77d9965-logs\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc 
kubenswrapper[4891]: I0929 10:06:43.302690 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77aa5ca3-797d-4f00-8f2d-d735b77d9965-scripts\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.302716 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77aa5ca3-797d-4f00-8f2d-d735b77d9965-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.302768 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl84h\" (UniqueName: \"kubernetes.io/projected/77aa5ca3-797d-4f00-8f2d-d735b77d9965-kube-api-access-bl84h\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.303731 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77aa5ca3-797d-4f00-8f2d-d735b77d9965-public-tls-certs\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.303773 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77aa5ca3-797d-4f00-8f2d-d735b77d9965-config-data-custom\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.303875 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/77aa5ca3-797d-4f00-8f2d-d735b77d9965-etc-machine-id\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.303890 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77aa5ca3-797d-4f00-8f2d-d735b77d9965-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.303911 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77aa5ca3-797d-4f00-8f2d-d735b77d9965-config-data\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.406193 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77aa5ca3-797d-4f00-8f2d-d735b77d9965-logs\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.406253 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77aa5ca3-797d-4f00-8f2d-d735b77d9965-scripts\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.406293 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77aa5ca3-797d-4f00-8f2d-d735b77d9965-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.406398 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl84h\" (UniqueName: \"kubernetes.io/projected/77aa5ca3-797d-4f00-8f2d-d735b77d9965-kube-api-access-bl84h\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.406425 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77aa5ca3-797d-4f00-8f2d-d735b77d9965-public-tls-certs\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.406462 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77aa5ca3-797d-4f00-8f2d-d735b77d9965-config-data-custom\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.406629 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/77aa5ca3-797d-4f00-8f2d-d735b77d9965-etc-machine-id\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.406643 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77aa5ca3-797d-4f00-8f2d-d735b77d9965-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.406671 4891 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77aa5ca3-797d-4f00-8f2d-d735b77d9965-config-data\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.407612 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77aa5ca3-797d-4f00-8f2d-d735b77d9965-logs\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.407683 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/77aa5ca3-797d-4f00-8f2d-d735b77d9965-etc-machine-id\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.412136 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77aa5ca3-797d-4f00-8f2d-d735b77d9965-scripts\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.414095 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77aa5ca3-797d-4f00-8f2d-d735b77d9965-config-data\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.417237 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77aa5ca3-797d-4f00-8f2d-d735b77d9965-public-tls-certs\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: 
I0929 10:06:43.417763 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77aa5ca3-797d-4f00-8f2d-d735b77d9965-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.418632 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77aa5ca3-797d-4f00-8f2d-d735b77d9965-config-data-custom\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.420136 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77aa5ca3-797d-4f00-8f2d-d735b77d9965-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.427014 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl84h\" (UniqueName: \"kubernetes.io/projected/77aa5ca3-797d-4f00-8f2d-d735b77d9965-kube-api-access-bl84h\") pod \"cinder-api-0\" (UID: \"77aa5ca3-797d-4f00-8f2d-d735b77d9965\") " pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.589882 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.610188 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.713013 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-86a2-account-create-dtlnc"] Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.714293 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-86a2-account-create-dtlnc" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.719698 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.729941 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-86a2-account-create-dtlnc"] Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.813763 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtd49\" (UniqueName: \"kubernetes.io/projected/f43d7587-9a50-41b6-9e8b-93b2e613c7c3-kube-api-access-xtd49\") pod \"nova-api-86a2-account-create-dtlnc\" (UID: \"f43d7587-9a50-41b6-9e8b-93b2e613c7c3\") " pod="openstack/nova-api-86a2-account-create-dtlnc" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.910883 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-b627-account-create-hdmpf"] Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.916686 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtd49\" (UniqueName: \"kubernetes.io/projected/f43d7587-9a50-41b6-9e8b-93b2e613c7c3-kube-api-access-xtd49\") pod \"nova-api-86a2-account-create-dtlnc\" (UID: \"f43d7587-9a50-41b6-9e8b-93b2e613c7c3\") " pod="openstack/nova-api-86a2-account-create-dtlnc" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.916702 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b627-account-create-hdmpf" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.918948 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.923025 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b627-account-create-hdmpf"] Sep 29 10:06:43 crc kubenswrapper[4891]: I0929 10:06:43.943365 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtd49\" (UniqueName: \"kubernetes.io/projected/f43d7587-9a50-41b6-9e8b-93b2e613c7c3-kube-api-access-xtd49\") pod \"nova-api-86a2-account-create-dtlnc\" (UID: \"f43d7587-9a50-41b6-9e8b-93b2e613c7c3\") " pod="openstack/nova-api-86a2-account-create-dtlnc" Sep 29 10:06:44 crc kubenswrapper[4891]: I0929 10:06:44.019194 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm48k\" (UniqueName: \"kubernetes.io/projected/1788415f-6215-4d2e-925b-bacb13797579-kube-api-access-mm48k\") pod \"nova-cell0-b627-account-create-hdmpf\" (UID: \"1788415f-6215-4d2e-925b-bacb13797579\") " pod="openstack/nova-cell0-b627-account-create-hdmpf" Sep 29 10:06:44 crc kubenswrapper[4891]: I0929 10:06:44.037679 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-86a2-account-create-dtlnc" Sep 29 10:06:44 crc kubenswrapper[4891]: I0929 10:06:44.114844 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-9552-account-create-l4fsm"] Sep 29 10:06:44 crc kubenswrapper[4891]: I0929 10:06:44.116552 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9552-account-create-l4fsm" Sep 29 10:06:44 crc kubenswrapper[4891]: I0929 10:06:44.121977 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm48k\" (UniqueName: \"kubernetes.io/projected/1788415f-6215-4d2e-925b-bacb13797579-kube-api-access-mm48k\") pod \"nova-cell0-b627-account-create-hdmpf\" (UID: \"1788415f-6215-4d2e-925b-bacb13797579\") " pod="openstack/nova-cell0-b627-account-create-hdmpf" Sep 29 10:06:44 crc kubenswrapper[4891]: I0929 10:06:44.128370 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Sep 29 10:06:44 crc kubenswrapper[4891]: I0929 10:06:44.147627 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm48k\" (UniqueName: \"kubernetes.io/projected/1788415f-6215-4d2e-925b-bacb13797579-kube-api-access-mm48k\") pod \"nova-cell0-b627-account-create-hdmpf\" (UID: \"1788415f-6215-4d2e-925b-bacb13797579\") " pod="openstack/nova-cell0-b627-account-create-hdmpf" Sep 29 10:06:44 crc kubenswrapper[4891]: I0929 10:06:44.154187 4891 generic.go:334] "Generic (PLEG): container finished" podID="45c770aa-803b-4a67-9893-5476845e722b" containerID="d0669b907cc3d7e8c87cadfaf246355966f47174c884b019614d59bfb2f25fde" exitCode=143 Sep 29 10:06:44 crc kubenswrapper[4891]: I0929 10:06:44.154306 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"45c770aa-803b-4a67-9893-5476845e722b","Type":"ContainerDied","Data":"d0669b907cc3d7e8c87cadfaf246355966f47174c884b019614d59bfb2f25fde"} Sep 29 10:06:44 crc kubenswrapper[4891]: I0929 10:06:44.166764 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9552-account-create-l4fsm"] Sep 29 10:06:44 crc kubenswrapper[4891]: I0929 10:06:44.175890 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"fb3d540e-66d9-4d98-8cf2-a75d809d76f8","Type":"ContainerStarted","Data":"c06fc285a050f7c829a2b0f5b01b5e924b0465b93440583964667fee048f13a8"} Sep 29 10:06:44 crc kubenswrapper[4891]: I0929 10:06:44.176005 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 29 10:06:44 crc kubenswrapper[4891]: I0929 10:06:44.206842 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 29 10:06:44 crc kubenswrapper[4891]: I0929 10:06:44.214128 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.610271871 podStartE2EDuration="7.214112116s" podCreationTimestamp="2025-09-29 10:06:37 +0000 UTC" firstStartedPulling="2025-09-29 10:06:38.412516126 +0000 UTC m=+1128.617684447" lastFinishedPulling="2025-09-29 10:06:43.016356371 +0000 UTC m=+1133.221524692" observedRunningTime="2025-09-29 10:06:44.213352534 +0000 UTC m=+1134.418520855" watchObservedRunningTime="2025-09-29 10:06:44.214112116 +0000 UTC m=+1134.419280437" Sep 29 10:06:44 crc kubenswrapper[4891]: I0929 10:06:44.223532 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zndj9\" (UniqueName: \"kubernetes.io/projected/a1ca376b-baea-4fb6-94ce-eb3def80c06e-kube-api-access-zndj9\") pod \"nova-cell1-9552-account-create-l4fsm\" (UID: \"a1ca376b-baea-4fb6-94ce-eb3def80c06e\") " pod="openstack/nova-cell1-9552-account-create-l4fsm" Sep 29 10:06:44 crc kubenswrapper[4891]: I0929 10:06:44.292536 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b627-account-create-hdmpf" Sep 29 10:06:44 crc kubenswrapper[4891]: I0929 10:06:44.338100 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zndj9\" (UniqueName: \"kubernetes.io/projected/a1ca376b-baea-4fb6-94ce-eb3def80c06e-kube-api-access-zndj9\") pod \"nova-cell1-9552-account-create-l4fsm\" (UID: \"a1ca376b-baea-4fb6-94ce-eb3def80c06e\") " pod="openstack/nova-cell1-9552-account-create-l4fsm" Sep 29 10:06:44 crc kubenswrapper[4891]: I0929 10:06:44.361416 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zndj9\" (UniqueName: \"kubernetes.io/projected/a1ca376b-baea-4fb6-94ce-eb3def80c06e-kube-api-access-zndj9\") pod \"nova-cell1-9552-account-create-l4fsm\" (UID: \"a1ca376b-baea-4fb6-94ce-eb3def80c06e\") " pod="openstack/nova-cell1-9552-account-create-l4fsm" Sep 29 10:06:44 crc kubenswrapper[4891]: I0929 10:06:44.451104 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="971c6661-3f72-479c-b95e-ab997fab876a" path="/var/lib/kubelet/pods/971c6661-3f72-479c-b95e-ab997fab876a/volumes" Sep 29 10:06:44 crc kubenswrapper[4891]: I0929 10:06:44.480245 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:44 crc kubenswrapper[4891]: I0929 10:06:44.537102 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9552-account-create-l4fsm" Sep 29 10:06:44 crc kubenswrapper[4891]: I0929 10:06:44.593311 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:06:44 crc kubenswrapper[4891]: I0929 10:06:44.649526 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-86a2-account-create-dtlnc"] Sep 29 10:06:44 crc kubenswrapper[4891]: I0929 10:06:44.926091 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b627-account-create-hdmpf"] Sep 29 10:06:44 crc kubenswrapper[4891]: W0929 10:06:44.935972 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1788415f_6215_4d2e_925b_bacb13797579.slice/crio-07458622a353a56cfa7267ebf4ea543d257d05d2ba9745d88f5a409f1e279d2f WatchSource:0}: Error finding container 07458622a353a56cfa7267ebf4ea543d257d05d2ba9745d88f5a409f1e279d2f: Status 404 returned error can't find the container with id 07458622a353a56cfa7267ebf4ea543d257d05d2ba9745d88f5a409f1e279d2f Sep 29 10:06:45 crc kubenswrapper[4891]: I0929 10:06:45.127961 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:06:45 crc kubenswrapper[4891]: I0929 10:06:45.128269 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="076fdc73-b211-49f2-89d6-ee6fff802f74" containerName="glance-log" containerID="cri-o://dc9d8171bcdde9cf808937e4987bab002f8d57bf3f88268920642ae422da90c6" gracePeriod=30 Sep 29 10:06:45 crc kubenswrapper[4891]: I0929 10:06:45.128857 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="076fdc73-b211-49f2-89d6-ee6fff802f74" containerName="glance-httpd" containerID="cri-o://c96f65321403a5f5053ff8ebc61e35675865e2e4730ce75835100b436eb5a932" gracePeriod=30 Sep 
29 10:06:45 crc kubenswrapper[4891]: I0929 10:06:45.201474 4891 generic.go:334] "Generic (PLEG): container finished" podID="f43d7587-9a50-41b6-9e8b-93b2e613c7c3" containerID="65271b9cb8a5b2fd2dd5a4ba84fdeb674ef6fe6d3e09305dd5af33edf0ad69cd" exitCode=0 Sep 29 10:06:45 crc kubenswrapper[4891]: I0929 10:06:45.201563 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-86a2-account-create-dtlnc" event={"ID":"f43d7587-9a50-41b6-9e8b-93b2e613c7c3","Type":"ContainerDied","Data":"65271b9cb8a5b2fd2dd5a4ba84fdeb674ef6fe6d3e09305dd5af33edf0ad69cd"} Sep 29 10:06:45 crc kubenswrapper[4891]: I0929 10:06:45.201592 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-86a2-account-create-dtlnc" event={"ID":"f43d7587-9a50-41b6-9e8b-93b2e613c7c3","Type":"ContainerStarted","Data":"8baca63c015d5001fece00a179ca9fd9c630c1cb8d4353a56752c42ac65ccee6"} Sep 29 10:06:45 crc kubenswrapper[4891]: I0929 10:06:45.204104 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"77aa5ca3-797d-4f00-8f2d-d735b77d9965","Type":"ContainerStarted","Data":"71a25de7a6d2e52473d89ad2ea197395f66f4832b800fdbffb1aa0678e7db351"} Sep 29 10:06:45 crc kubenswrapper[4891]: I0929 10:06:45.208874 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b627-account-create-hdmpf" event={"ID":"1788415f-6215-4d2e-925b-bacb13797579","Type":"ContainerStarted","Data":"07458622a353a56cfa7267ebf4ea543d257d05d2ba9745d88f5a409f1e279d2f"} Sep 29 10:06:45 crc kubenswrapper[4891]: I0929 10:06:45.247927 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9552-account-create-l4fsm"] Sep 29 10:06:45 crc kubenswrapper[4891]: I0929 10:06:45.495253 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-696f7ffc96-xhjxt" Sep 29 10:06:46 crc kubenswrapper[4891]: I0929 10:06:46.217461 4891 generic.go:334] "Generic (PLEG): container finished" 
podID="076fdc73-b211-49f2-89d6-ee6fff802f74" containerID="dc9d8171bcdde9cf808937e4987bab002f8d57bf3f88268920642ae422da90c6" exitCode=143 Sep 29 10:06:46 crc kubenswrapper[4891]: I0929 10:06:46.217542 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"076fdc73-b211-49f2-89d6-ee6fff802f74","Type":"ContainerDied","Data":"dc9d8171bcdde9cf808937e4987bab002f8d57bf3f88268920642ae422da90c6"} Sep 29 10:06:46 crc kubenswrapper[4891]: I0929 10:06:46.220408 4891 generic.go:334] "Generic (PLEG): container finished" podID="a1ca376b-baea-4fb6-94ce-eb3def80c06e" containerID="6a57541754cdf69b2f3908fae36525ce3f9ee5bef4fd14b7a5e1c5e9fc0dfbd7" exitCode=0 Sep 29 10:06:46 crc kubenswrapper[4891]: I0929 10:06:46.220466 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9552-account-create-l4fsm" event={"ID":"a1ca376b-baea-4fb6-94ce-eb3def80c06e","Type":"ContainerDied","Data":"6a57541754cdf69b2f3908fae36525ce3f9ee5bef4fd14b7a5e1c5e9fc0dfbd7"} Sep 29 10:06:46 crc kubenswrapper[4891]: I0929 10:06:46.220538 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9552-account-create-l4fsm" event={"ID":"a1ca376b-baea-4fb6-94ce-eb3def80c06e","Type":"ContainerStarted","Data":"c062d688cf4e2dd56edd9fa3f2549b7e2975a516e685c29799ade96ce7a05cef"} Sep 29 10:06:46 crc kubenswrapper[4891]: I0929 10:06:46.258004 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"77aa5ca3-797d-4f00-8f2d-d735b77d9965","Type":"ContainerStarted","Data":"3e71de5c09842561ffcd0219bef7f8ca10f8587939a5c8443501d3839f5136fc"} Sep 29 10:06:46 crc kubenswrapper[4891]: I0929 10:06:46.258052 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"77aa5ca3-797d-4f00-8f2d-d735b77d9965","Type":"ContainerStarted","Data":"13e960e99b28109d6e4e2c4919442196b0695c0237ef0575aac4f3a4a84b999d"} Sep 29 10:06:46 crc kubenswrapper[4891]: I0929 
10:06:46.259540 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 29 10:06:46 crc kubenswrapper[4891]: I0929 10:06:46.262344 4891 generic.go:334] "Generic (PLEG): container finished" podID="1788415f-6215-4d2e-925b-bacb13797579" containerID="311ce8af5d677477c40eb7933b86dced495a406b0d7075bdeff78e892d32efd1" exitCode=0 Sep 29 10:06:46 crc kubenswrapper[4891]: I0929 10:06:46.262511 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b627-account-create-hdmpf" event={"ID":"1788415f-6215-4d2e-925b-bacb13797579","Type":"ContainerDied","Data":"311ce8af5d677477c40eb7933b86dced495a406b0d7075bdeff78e892d32efd1"} Sep 29 10:06:46 crc kubenswrapper[4891]: I0929 10:06:46.262694 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fb3d540e-66d9-4d98-8cf2-a75d809d76f8" containerName="ceilometer-central-agent" containerID="cri-o://f3385b43b3813904574c23b8b51608d73f349a3e0e6433edd0efac72b4ffdeab" gracePeriod=30 Sep 29 10:06:46 crc kubenswrapper[4891]: I0929 10:06:46.262960 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fb3d540e-66d9-4d98-8cf2-a75d809d76f8" containerName="proxy-httpd" containerID="cri-o://c06fc285a050f7c829a2b0f5b01b5e924b0465b93440583964667fee048f13a8" gracePeriod=30 Sep 29 10:06:46 crc kubenswrapper[4891]: I0929 10:06:46.263015 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fb3d540e-66d9-4d98-8cf2-a75d809d76f8" containerName="sg-core" containerID="cri-o://4c12211efc349d32904acd71267efa623c5128327b28628c64a8ab625f187f18" gracePeriod=30 Sep 29 10:06:46 crc kubenswrapper[4891]: I0929 10:06:46.263059 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fb3d540e-66d9-4d98-8cf2-a75d809d76f8" containerName="ceilometer-notification-agent" 
containerID="cri-o://a44bb4819fb7f27879bd81effbeb2297eb677d033e974498d7d81fa2f71b50b5" gracePeriod=30 Sep 29 10:06:46 crc kubenswrapper[4891]: I0929 10:06:46.276253 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.276233829 podStartE2EDuration="3.276233829s" podCreationTimestamp="2025-09-29 10:06:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:06:46.274762786 +0000 UTC m=+1136.479931117" watchObservedRunningTime="2025-09-29 10:06:46.276233829 +0000 UTC m=+1136.481402150" Sep 29 10:06:46 crc kubenswrapper[4891]: I0929 10:06:46.664353 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-86a2-account-create-dtlnc" Sep 29 10:06:46 crc kubenswrapper[4891]: I0929 10:06:46.734989 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtd49\" (UniqueName: \"kubernetes.io/projected/f43d7587-9a50-41b6-9e8b-93b2e613c7c3-kube-api-access-xtd49\") pod \"f43d7587-9a50-41b6-9e8b-93b2e613c7c3\" (UID: \"f43d7587-9a50-41b6-9e8b-93b2e613c7c3\") " Sep 29 10:06:46 crc kubenswrapper[4891]: I0929 10:06:46.743607 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f43d7587-9a50-41b6-9e8b-93b2e613c7c3-kube-api-access-xtd49" (OuterVolumeSpecName: "kube-api-access-xtd49") pod "f43d7587-9a50-41b6-9e8b-93b2e613c7c3" (UID: "f43d7587-9a50-41b6-9e8b-93b2e613c7c3"). InnerVolumeSpecName "kube-api-access-xtd49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:06:46 crc kubenswrapper[4891]: I0929 10:06:46.837544 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtd49\" (UniqueName: \"kubernetes.io/projected/f43d7587-9a50-41b6-9e8b-93b2e613c7c3-kube-api-access-xtd49\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:46 crc kubenswrapper[4891]: I0929 10:06:46.873363 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:46 crc kubenswrapper[4891]: I0929 10:06:46.878350 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7df6f5468c-2kcvk" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.149146 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.249605 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45c770aa-803b-4a67-9893-5476845e722b-internal-tls-certs\") pod \"45c770aa-803b-4a67-9893-5476845e722b\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.249714 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45c770aa-803b-4a67-9893-5476845e722b-scripts\") pod \"45c770aa-803b-4a67-9893-5476845e722b\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.249760 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c770aa-803b-4a67-9893-5476845e722b-combined-ca-bundle\") pod \"45c770aa-803b-4a67-9893-5476845e722b\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 
10:06:47.249821 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45c770aa-803b-4a67-9893-5476845e722b-config-data\") pod \"45c770aa-803b-4a67-9893-5476845e722b\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.249857 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45c770aa-803b-4a67-9893-5476845e722b-logs\") pod \"45c770aa-803b-4a67-9893-5476845e722b\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.249910 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"45c770aa-803b-4a67-9893-5476845e722b\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.250087 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45c770aa-803b-4a67-9893-5476845e722b-httpd-run\") pod \"45c770aa-803b-4a67-9893-5476845e722b\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.250219 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlczz\" (UniqueName: \"kubernetes.io/projected/45c770aa-803b-4a67-9893-5476845e722b-kube-api-access-rlczz\") pod \"45c770aa-803b-4a67-9893-5476845e722b\" (UID: \"45c770aa-803b-4a67-9893-5476845e722b\") " Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.257345 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45c770aa-803b-4a67-9893-5476845e722b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "45c770aa-803b-4a67-9893-5476845e722b" (UID: "45c770aa-803b-4a67-9893-5476845e722b"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.257646 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45c770aa-803b-4a67-9893-5476845e722b-logs" (OuterVolumeSpecName: "logs") pod "45c770aa-803b-4a67-9893-5476845e722b" (UID: "45c770aa-803b-4a67-9893-5476845e722b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.278136 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "45c770aa-803b-4a67-9893-5476845e722b" (UID: "45c770aa-803b-4a67-9893-5476845e722b"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.278260 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45c770aa-803b-4a67-9893-5476845e722b-scripts" (OuterVolumeSpecName: "scripts") pod "45c770aa-803b-4a67-9893-5476845e722b" (UID: "45c770aa-803b-4a67-9893-5476845e722b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.305072 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45c770aa-803b-4a67-9893-5476845e722b-kube-api-access-rlczz" (OuterVolumeSpecName: "kube-api-access-rlczz") pod "45c770aa-803b-4a67-9893-5476845e722b" (UID: "45c770aa-803b-4a67-9893-5476845e722b"). InnerVolumeSpecName "kube-api-access-rlczz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.337429 4891 generic.go:334] "Generic (PLEG): container finished" podID="45c770aa-803b-4a67-9893-5476845e722b" containerID="9103a76e6bcf89965269b9e15298120cb4b4733599df4c283fe1de2ab1b5b6f9" exitCode=0 Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.337516 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"45c770aa-803b-4a67-9893-5476845e722b","Type":"ContainerDied","Data":"9103a76e6bcf89965269b9e15298120cb4b4733599df4c283fe1de2ab1b5b6f9"} Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.337543 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"45c770aa-803b-4a67-9893-5476845e722b","Type":"ContainerDied","Data":"1fe8a0d195a9efdb43e00fe480201a341fd91b9f17e4bdaaf6e37f0527da4076"} Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.337560 4891 scope.go:117] "RemoveContainer" containerID="9103a76e6bcf89965269b9e15298120cb4b4733599df4c283fe1de2ab1b5b6f9" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.337706 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.341991 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45c770aa-803b-4a67-9893-5476845e722b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45c770aa-803b-4a67-9893-5476845e722b" (UID: "45c770aa-803b-4a67-9893-5476845e722b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.353089 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlczz\" (UniqueName: \"kubernetes.io/projected/45c770aa-803b-4a67-9893-5476845e722b-kube-api-access-rlczz\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.353119 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45c770aa-803b-4a67-9893-5476845e722b-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.353129 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c770aa-803b-4a67-9893-5476845e722b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.353139 4891 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45c770aa-803b-4a67-9893-5476845e722b-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.353166 4891 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.353175 4891 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45c770aa-803b-4a67-9893-5476845e722b-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.364226 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-86a2-account-create-dtlnc" event={"ID":"f43d7587-9a50-41b6-9e8b-93b2e613c7c3","Type":"ContainerDied","Data":"8baca63c015d5001fece00a179ca9fd9c630c1cb8d4353a56752c42ac65ccee6"} Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.364277 
4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8baca63c015d5001fece00a179ca9fd9c630c1cb8d4353a56752c42ac65ccee6" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.364339 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-86a2-account-create-dtlnc" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.378356 4891 generic.go:334] "Generic (PLEG): container finished" podID="fb3d540e-66d9-4d98-8cf2-a75d809d76f8" containerID="c06fc285a050f7c829a2b0f5b01b5e924b0465b93440583964667fee048f13a8" exitCode=0 Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.378380 4891 generic.go:334] "Generic (PLEG): container finished" podID="fb3d540e-66d9-4d98-8cf2-a75d809d76f8" containerID="4c12211efc349d32904acd71267efa623c5128327b28628c64a8ab625f187f18" exitCode=2 Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.378390 4891 generic.go:334] "Generic (PLEG): container finished" podID="fb3d540e-66d9-4d98-8cf2-a75d809d76f8" containerID="a44bb4819fb7f27879bd81effbeb2297eb677d033e974498d7d81fa2f71b50b5" exitCode=0 Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.378399 4891 generic.go:334] "Generic (PLEG): container finished" podID="fb3d540e-66d9-4d98-8cf2-a75d809d76f8" containerID="f3385b43b3813904574c23b8b51608d73f349a3e0e6433edd0efac72b4ffdeab" exitCode=0 Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.379092 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb3d540e-66d9-4d98-8cf2-a75d809d76f8","Type":"ContainerDied","Data":"c06fc285a050f7c829a2b0f5b01b5e924b0465b93440583964667fee048f13a8"} Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.379159 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb3d540e-66d9-4d98-8cf2-a75d809d76f8","Type":"ContainerDied","Data":"4c12211efc349d32904acd71267efa623c5128327b28628c64a8ab625f187f18"} Sep 29 10:06:47 crc 
kubenswrapper[4891]: I0929 10:06:47.379169 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb3d540e-66d9-4d98-8cf2-a75d809d76f8","Type":"ContainerDied","Data":"a44bb4819fb7f27879bd81effbeb2297eb677d033e974498d7d81fa2f71b50b5"} Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.379178 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb3d540e-66d9-4d98-8cf2-a75d809d76f8","Type":"ContainerDied","Data":"f3385b43b3813904574c23b8b51608d73f349a3e0e6433edd0efac72b4ffdeab"} Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.383094 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45c770aa-803b-4a67-9893-5476845e722b-config-data" (OuterVolumeSpecName: "config-data") pod "45c770aa-803b-4a67-9893-5476845e722b" (UID: "45c770aa-803b-4a67-9893-5476845e722b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.431682 4891 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.453258 4891 scope.go:117] "RemoveContainer" containerID="d0669b907cc3d7e8c87cadfaf246355966f47174c884b019614d59bfb2f25fde" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.472725 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45c770aa-803b-4a67-9893-5476845e722b-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.472758 4891 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.482434 4891 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45c770aa-803b-4a67-9893-5476845e722b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "45c770aa-803b-4a67-9893-5476845e722b" (UID: "45c770aa-803b-4a67-9893-5476845e722b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.482527 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.514191 4891 scope.go:117] "RemoveContainer" containerID="9103a76e6bcf89965269b9e15298120cb4b4733599df4c283fe1de2ab1b5b6f9" Sep 29 10:06:47 crc kubenswrapper[4891]: E0929 10:06:47.518056 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9103a76e6bcf89965269b9e15298120cb4b4733599df4c283fe1de2ab1b5b6f9\": container with ID starting with 9103a76e6bcf89965269b9e15298120cb4b4733599df4c283fe1de2ab1b5b6f9 not found: ID does not exist" containerID="9103a76e6bcf89965269b9e15298120cb4b4733599df4c283fe1de2ab1b5b6f9" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.518099 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9103a76e6bcf89965269b9e15298120cb4b4733599df4c283fe1de2ab1b5b6f9"} err="failed to get container status \"9103a76e6bcf89965269b9e15298120cb4b4733599df4c283fe1de2ab1b5b6f9\": rpc error: code = NotFound desc = could not find container \"9103a76e6bcf89965269b9e15298120cb4b4733599df4c283fe1de2ab1b5b6f9\": container with ID starting with 9103a76e6bcf89965269b9e15298120cb4b4733599df4c283fe1de2ab1b5b6f9 not found: ID does not exist" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.518125 4891 scope.go:117] "RemoveContainer" containerID="d0669b907cc3d7e8c87cadfaf246355966f47174c884b019614d59bfb2f25fde" Sep 29 10:06:47 crc 
kubenswrapper[4891]: E0929 10:06:47.518369 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0669b907cc3d7e8c87cadfaf246355966f47174c884b019614d59bfb2f25fde\": container with ID starting with d0669b907cc3d7e8c87cadfaf246355966f47174c884b019614d59bfb2f25fde not found: ID does not exist" containerID="d0669b907cc3d7e8c87cadfaf246355966f47174c884b019614d59bfb2f25fde" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.518388 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0669b907cc3d7e8c87cadfaf246355966f47174c884b019614d59bfb2f25fde"} err="failed to get container status \"d0669b907cc3d7e8c87cadfaf246355966f47174c884b019614d59bfb2f25fde\": rpc error: code = NotFound desc = could not find container \"d0669b907cc3d7e8c87cadfaf246355966f47174c884b019614d59bfb2f25fde\": container with ID starting with d0669b907cc3d7e8c87cadfaf246355966f47174c884b019614d59bfb2f25fde not found: ID does not exist" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.574339 4891 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45c770aa-803b-4a67-9893-5476845e722b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.676603 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-config-data\") pod \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.676702 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-scripts\") pod \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " Sep 29 
10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.676730 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-log-httpd\") pod \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.676773 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-942zl\" (UniqueName: \"kubernetes.io/projected/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-kube-api-access-942zl\") pod \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.676826 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-run-httpd\") pod \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.676880 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-sg-core-conf-yaml\") pod \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.677117 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-combined-ca-bundle\") pod \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\" (UID: \"fb3d540e-66d9-4d98-8cf2-a75d809d76f8\") " Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.682725 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-log-httpd" (OuterVolumeSpecName: 
"log-httpd") pod "fb3d540e-66d9-4d98-8cf2-a75d809d76f8" (UID: "fb3d540e-66d9-4d98-8cf2-a75d809d76f8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.682959 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fb3d540e-66d9-4d98-8cf2-a75d809d76f8" (UID: "fb3d540e-66d9-4d98-8cf2-a75d809d76f8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.691739 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.692188 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-scripts" (OuterVolumeSpecName: "scripts") pod "fb3d540e-66d9-4d98-8cf2-a75d809d76f8" (UID: "fb3d540e-66d9-4d98-8cf2-a75d809d76f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.703943 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-kube-api-access-942zl" (OuterVolumeSpecName: "kube-api-access-942zl") pod "fb3d540e-66d9-4d98-8cf2-a75d809d76f8" (UID: "fb3d540e-66d9-4d98-8cf2-a75d809d76f8"). InnerVolumeSpecName "kube-api-access-942zl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.713099 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.723233 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:06:47 crc kubenswrapper[4891]: E0929 10:06:47.723696 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45c770aa-803b-4a67-9893-5476845e722b" containerName="glance-log" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.723715 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="45c770aa-803b-4a67-9893-5476845e722b" containerName="glance-log" Sep 29 10:06:47 crc kubenswrapper[4891]: E0929 10:06:47.723740 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3d540e-66d9-4d98-8cf2-a75d809d76f8" containerName="ceilometer-notification-agent" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.723747 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3d540e-66d9-4d98-8cf2-a75d809d76f8" containerName="ceilometer-notification-agent" Sep 29 10:06:47 crc kubenswrapper[4891]: E0929 10:06:47.723759 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3d540e-66d9-4d98-8cf2-a75d809d76f8" containerName="sg-core" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.723766 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3d540e-66d9-4d98-8cf2-a75d809d76f8" containerName="sg-core" Sep 29 10:06:47 crc kubenswrapper[4891]: E0929 10:06:47.723779 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45c770aa-803b-4a67-9893-5476845e722b" containerName="glance-httpd" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.723800 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="45c770aa-803b-4a67-9893-5476845e722b" containerName="glance-httpd" Sep 29 10:06:47 crc 
kubenswrapper[4891]: E0929 10:06:47.723816 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3d540e-66d9-4d98-8cf2-a75d809d76f8" containerName="proxy-httpd" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.723823 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3d540e-66d9-4d98-8cf2-a75d809d76f8" containerName="proxy-httpd" Sep 29 10:06:47 crc kubenswrapper[4891]: E0929 10:06:47.723846 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3d540e-66d9-4d98-8cf2-a75d809d76f8" containerName="ceilometer-central-agent" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.723854 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3d540e-66d9-4d98-8cf2-a75d809d76f8" containerName="ceilometer-central-agent" Sep 29 10:06:47 crc kubenswrapper[4891]: E0929 10:06:47.723864 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43d7587-9a50-41b6-9e8b-93b2e613c7c3" containerName="mariadb-account-create" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.723871 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43d7587-9a50-41b6-9e8b-93b2e613c7c3" containerName="mariadb-account-create" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.724057 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="f43d7587-9a50-41b6-9e8b-93b2e613c7c3" containerName="mariadb-account-create" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.724073 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="45c770aa-803b-4a67-9893-5476845e722b" containerName="glance-httpd" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.724080 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb3d540e-66d9-4d98-8cf2-a75d809d76f8" containerName="proxy-httpd" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.724091 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb3d540e-66d9-4d98-8cf2-a75d809d76f8" 
containerName="ceilometer-central-agent" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.724102 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb3d540e-66d9-4d98-8cf2-a75d809d76f8" containerName="ceilometer-notification-agent" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.724152 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="45c770aa-803b-4a67-9893-5476845e722b" containerName="glance-log" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.724164 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb3d540e-66d9-4d98-8cf2-a75d809d76f8" containerName="sg-core" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.729246 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fb3d540e-66d9-4d98-8cf2-a75d809d76f8" (UID: "fb3d540e-66d9-4d98-8cf2-a75d809d76f8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.739327 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.740856 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.744564 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.744774 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.781334 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-942zl\" (UniqueName: \"kubernetes.io/projected/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-kube-api-access-942zl\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.781364 4891 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.781374 4891 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.781384 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.781393 4891 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.808462 4891 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb3d540e-66d9-4d98-8cf2-a75d809d76f8" (UID: "fb3d540e-66d9-4d98-8cf2-a75d809d76f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.877348 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-config-data" (OuterVolumeSpecName: "config-data") pod "fb3d540e-66d9-4d98-8cf2-a75d809d76f8" (UID: "fb3d540e-66d9-4d98-8cf2-a75d809d76f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.882953 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bb97\" (UniqueName: \"kubernetes.io/projected/a6e7444d-97cc-440f-92de-e9db5ff440b5-kube-api-access-8bb97\") pod \"glance-default-internal-api-0\" (UID: \"a6e7444d-97cc-440f-92de-e9db5ff440b5\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.883031 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6e7444d-97cc-440f-92de-e9db5ff440b5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a6e7444d-97cc-440f-92de-e9db5ff440b5\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.883064 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6e7444d-97cc-440f-92de-e9db5ff440b5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a6e7444d-97cc-440f-92de-e9db5ff440b5\") " 
pod="openstack/glance-default-internal-api-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.883109 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"a6e7444d-97cc-440f-92de-e9db5ff440b5\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.883130 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6e7444d-97cc-440f-92de-e9db5ff440b5-logs\") pod \"glance-default-internal-api-0\" (UID: \"a6e7444d-97cc-440f-92de-e9db5ff440b5\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.883156 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e7444d-97cc-440f-92de-e9db5ff440b5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a6e7444d-97cc-440f-92de-e9db5ff440b5\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.883170 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e7444d-97cc-440f-92de-e9db5ff440b5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a6e7444d-97cc-440f-92de-e9db5ff440b5\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.883220 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6e7444d-97cc-440f-92de-e9db5ff440b5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"a6e7444d-97cc-440f-92de-e9db5ff440b5\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.883280 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.883293 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3d540e-66d9-4d98-8cf2-a75d809d76f8-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.985330 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bb97\" (UniqueName: \"kubernetes.io/projected/a6e7444d-97cc-440f-92de-e9db5ff440b5-kube-api-access-8bb97\") pod \"glance-default-internal-api-0\" (UID: \"a6e7444d-97cc-440f-92de-e9db5ff440b5\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.985416 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6e7444d-97cc-440f-92de-e9db5ff440b5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a6e7444d-97cc-440f-92de-e9db5ff440b5\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.985463 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6e7444d-97cc-440f-92de-e9db5ff440b5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a6e7444d-97cc-440f-92de-e9db5ff440b5\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.985515 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"a6e7444d-97cc-440f-92de-e9db5ff440b5\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.985541 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6e7444d-97cc-440f-92de-e9db5ff440b5-logs\") pod \"glance-default-internal-api-0\" (UID: \"a6e7444d-97cc-440f-92de-e9db5ff440b5\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.985569 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e7444d-97cc-440f-92de-e9db5ff440b5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a6e7444d-97cc-440f-92de-e9db5ff440b5\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.985593 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e7444d-97cc-440f-92de-e9db5ff440b5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a6e7444d-97cc-440f-92de-e9db5ff440b5\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.985665 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6e7444d-97cc-440f-92de-e9db5ff440b5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a6e7444d-97cc-440f-92de-e9db5ff440b5\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.985890 4891 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" 
(UID: \"a6e7444d-97cc-440f-92de-e9db5ff440b5\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.986328 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6e7444d-97cc-440f-92de-e9db5ff440b5-logs\") pod \"glance-default-internal-api-0\" (UID: \"a6e7444d-97cc-440f-92de-e9db5ff440b5\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.987185 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6e7444d-97cc-440f-92de-e9db5ff440b5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a6e7444d-97cc-440f-92de-e9db5ff440b5\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.990574 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6e7444d-97cc-440f-92de-e9db5ff440b5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a6e7444d-97cc-440f-92de-e9db5ff440b5\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.990851 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6e7444d-97cc-440f-92de-e9db5ff440b5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a6e7444d-97cc-440f-92de-e9db5ff440b5\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:06:47 crc kubenswrapper[4891]: I0929 10:06:47.992411 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e7444d-97cc-440f-92de-e9db5ff440b5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a6e7444d-97cc-440f-92de-e9db5ff440b5\") " 
pod="openstack/glance-default-internal-api-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.004926 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e7444d-97cc-440f-92de-e9db5ff440b5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a6e7444d-97cc-440f-92de-e9db5ff440b5\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.019682 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bb97\" (UniqueName: \"kubernetes.io/projected/a6e7444d-97cc-440f-92de-e9db5ff440b5-kube-api-access-8bb97\") pod \"glance-default-internal-api-0\" (UID: \"a6e7444d-97cc-440f-92de-e9db5ff440b5\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.031207 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"a6e7444d-97cc-440f-92de-e9db5ff440b5\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.085380 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.123228 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9552-account-create-l4fsm" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.165341 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b627-account-create-hdmpf" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.293680 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zndj9\" (UniqueName: \"kubernetes.io/projected/a1ca376b-baea-4fb6-94ce-eb3def80c06e-kube-api-access-zndj9\") pod \"a1ca376b-baea-4fb6-94ce-eb3def80c06e\" (UID: \"a1ca376b-baea-4fb6-94ce-eb3def80c06e\") " Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.294643 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm48k\" (UniqueName: \"kubernetes.io/projected/1788415f-6215-4d2e-925b-bacb13797579-kube-api-access-mm48k\") pod \"1788415f-6215-4d2e-925b-bacb13797579\" (UID: \"1788415f-6215-4d2e-925b-bacb13797579\") " Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.300782 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1788415f-6215-4d2e-925b-bacb13797579-kube-api-access-mm48k" (OuterVolumeSpecName: "kube-api-access-mm48k") pod "1788415f-6215-4d2e-925b-bacb13797579" (UID: "1788415f-6215-4d2e-925b-bacb13797579"). InnerVolumeSpecName "kube-api-access-mm48k". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.301320 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1ca376b-baea-4fb6-94ce-eb3def80c06e-kube-api-access-zndj9" (OuterVolumeSpecName: "kube-api-access-zndj9") pod "a1ca376b-baea-4fb6-94ce-eb3def80c06e" (UID: "a1ca376b-baea-4fb6-94ce-eb3def80c06e"). InnerVolumeSpecName "kube-api-access-zndj9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.316325 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.390190 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-hwd4m"] Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.390483 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" podUID="eeaee9da-afe7-4854-95a3-ac91aeac850e" containerName="dnsmasq-dns" containerID="cri-o://936cd5a0ee9dfc94920ca83369700abd1c8da7b79e8649b713110f41cd242f00" gracePeriod=10 Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.397447 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zndj9\" (UniqueName: \"kubernetes.io/projected/a1ca376b-baea-4fb6-94ce-eb3def80c06e-kube-api-access-zndj9\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.398340 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm48k\" (UniqueName: \"kubernetes.io/projected/1788415f-6215-4d2e-925b-bacb13797579-kube-api-access-mm48k\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.405754 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9552-account-create-l4fsm" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.409475 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b627-account-create-hdmpf" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.411557 4891 generic.go:334] "Generic (PLEG): container finished" podID="076fdc73-b211-49f2-89d6-ee6fff802f74" containerID="c96f65321403a5f5053ff8ebc61e35675865e2e4730ce75835100b436eb5a932" exitCode=0 Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.415489 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.431989 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45c770aa-803b-4a67-9893-5476845e722b" path="/var/lib/kubelet/pods/45c770aa-803b-4a67-9893-5476845e722b/volumes" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.435783 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9552-account-create-l4fsm" event={"ID":"a1ca376b-baea-4fb6-94ce-eb3def80c06e","Type":"ContainerDied","Data":"c062d688cf4e2dd56edd9fa3f2549b7e2975a516e685c29799ade96ce7a05cef"} Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.435859 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c062d688cf4e2dd56edd9fa3f2549b7e2975a516e685c29799ade96ce7a05cef" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.435879 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b627-account-create-hdmpf" event={"ID":"1788415f-6215-4d2e-925b-bacb13797579","Type":"ContainerDied","Data":"07458622a353a56cfa7267ebf4ea543d257d05d2ba9745d88f5a409f1e279d2f"} Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.435891 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07458622a353a56cfa7267ebf4ea543d257d05d2ba9745d88f5a409f1e279d2f" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.435900 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"076fdc73-b211-49f2-89d6-ee6fff802f74","Type":"ContainerDied","Data":"c96f65321403a5f5053ff8ebc61e35675865e2e4730ce75835100b436eb5a932"} Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.435913 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb3d540e-66d9-4d98-8cf2-a75d809d76f8","Type":"ContainerDied","Data":"fca0246e3c86aebbc4f681384df8b448adb73e73120c34fb54475641990303f1"} Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.435939 4891 scope.go:117] "RemoveContainer" containerID="c06fc285a050f7c829a2b0f5b01b5e924b0465b93440583964667fee048f13a8" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.512976 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.514052 4891 scope.go:117] "RemoveContainer" containerID="4c12211efc349d32904acd71267efa623c5128327b28628c64a8ab625f187f18" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.521815 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.536077 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:06:48 crc kubenswrapper[4891]: E0929 10:06:48.536481 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ca376b-baea-4fb6-94ce-eb3def80c06e" containerName="mariadb-account-create" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.536493 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ca376b-baea-4fb6-94ce-eb3def80c06e" containerName="mariadb-account-create" Sep 29 10:06:48 crc kubenswrapper[4891]: E0929 10:06:48.536521 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1788415f-6215-4d2e-925b-bacb13797579" containerName="mariadb-account-create" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.536527 4891 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1788415f-6215-4d2e-925b-bacb13797579" containerName="mariadb-account-create" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.536708 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="1788415f-6215-4d2e-925b-bacb13797579" containerName="mariadb-account-create" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.536726 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1ca376b-baea-4fb6-94ce-eb3def80c06e" containerName="mariadb-account-create" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.538430 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.546864 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.547181 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.555591 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.565823 4891 scope.go:117] "RemoveContainer" containerID="a44bb4819fb7f27879bd81effbeb2297eb677d033e974498d7d81fa2f71b50b5" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.597647 4891 scope.go:117] "RemoveContainer" containerID="f3385b43b3813904574c23b8b51608d73f349a3e0e6433edd0efac72b4ffdeab" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.704487 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-log-httpd\") pod \"ceilometer-0\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " pod="openstack/ceilometer-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.704542 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " pod="openstack/ceilometer-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.704623 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " pod="openstack/ceilometer-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.704664 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-config-data\") pod \"ceilometer-0\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " pod="openstack/ceilometer-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.704712 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-run-httpd\") pod \"ceilometer-0\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " pod="openstack/ceilometer-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.704730 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td4b5\" (UniqueName: \"kubernetes.io/projected/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-kube-api-access-td4b5\") pod \"ceilometer-0\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " pod="openstack/ceilometer-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.704774 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-scripts\") pod \"ceilometer-0\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " pod="openstack/ceilometer-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.714166 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.806253 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " pod="openstack/ceilometer-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.806319 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-config-data\") pod \"ceilometer-0\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " pod="openstack/ceilometer-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.806359 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-run-httpd\") pod \"ceilometer-0\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " pod="openstack/ceilometer-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.806378 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td4b5\" (UniqueName: \"kubernetes.io/projected/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-kube-api-access-td4b5\") pod \"ceilometer-0\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " pod="openstack/ceilometer-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.806422 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-scripts\") pod \"ceilometer-0\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " pod="openstack/ceilometer-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.806473 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-log-httpd\") pod \"ceilometer-0\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " pod="openstack/ceilometer-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.806489 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " pod="openstack/ceilometer-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.807071 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-run-httpd\") pod \"ceilometer-0\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " pod="openstack/ceilometer-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.811883 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-scripts\") pod \"ceilometer-0\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " pod="openstack/ceilometer-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.812104 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-log-httpd\") pod \"ceilometer-0\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " pod="openstack/ceilometer-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.813418 4891 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " pod="openstack/ceilometer-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.814801 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " pod="openstack/ceilometer-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.820546 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-config-data\") pod \"ceilometer-0\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " pod="openstack/ceilometer-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.825450 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td4b5\" (UniqueName: \"kubernetes.io/projected/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-kube-api-access-td4b5\") pod \"ceilometer-0\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " pod="openstack/ceilometer-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.859487 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.867923 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:06:48 crc kubenswrapper[4891]: I0929 10:06:48.939718 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.094816 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.121142 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.227887 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/076fdc73-b211-49f2-89d6-ee6fff802f74-httpd-run\") pod \"076fdc73-b211-49f2-89d6-ee6fff802f74\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.228264 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/076fdc73-b211-49f2-89d6-ee6fff802f74-config-data\") pod \"076fdc73-b211-49f2-89d6-ee6fff802f74\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.228315 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-ovsdbserver-sb\") pod \"eeaee9da-afe7-4854-95a3-ac91aeac850e\" (UID: \"eeaee9da-afe7-4854-95a3-ac91aeac850e\") " Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.228372 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt77m\" (UniqueName: \"kubernetes.io/projected/eeaee9da-afe7-4854-95a3-ac91aeac850e-kube-api-access-tt77m\") pod \"eeaee9da-afe7-4854-95a3-ac91aeac850e\" (UID: \"eeaee9da-afe7-4854-95a3-ac91aeac850e\") " Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.228485 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/076fdc73-b211-49f2-89d6-ee6fff802f74-public-tls-certs\") pod \"076fdc73-b211-49f2-89d6-ee6fff802f74\" (UID: 
\"076fdc73-b211-49f2-89d6-ee6fff802f74\") " Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.228498 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/076fdc73-b211-49f2-89d6-ee6fff802f74-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "076fdc73-b211-49f2-89d6-ee6fff802f74" (UID: "076fdc73-b211-49f2-89d6-ee6fff802f74"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.228534 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmm6l\" (UniqueName: \"kubernetes.io/projected/076fdc73-b211-49f2-89d6-ee6fff802f74-kube-api-access-kmm6l\") pod \"076fdc73-b211-49f2-89d6-ee6fff802f74\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.228602 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/076fdc73-b211-49f2-89d6-ee6fff802f74-scripts\") pod \"076fdc73-b211-49f2-89d6-ee6fff802f74\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.228631 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-config\") pod \"eeaee9da-afe7-4854-95a3-ac91aeac850e\" (UID: \"eeaee9da-afe7-4854-95a3-ac91aeac850e\") " Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.228673 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"076fdc73-b211-49f2-89d6-ee6fff802f74\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.228715 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-dns-swift-storage-0\") pod \"eeaee9da-afe7-4854-95a3-ac91aeac850e\" (UID: \"eeaee9da-afe7-4854-95a3-ac91aeac850e\") " Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.228736 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-ovsdbserver-nb\") pod \"eeaee9da-afe7-4854-95a3-ac91aeac850e\" (UID: \"eeaee9da-afe7-4854-95a3-ac91aeac850e\") " Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.228757 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-dns-svc\") pod \"eeaee9da-afe7-4854-95a3-ac91aeac850e\" (UID: \"eeaee9da-afe7-4854-95a3-ac91aeac850e\") " Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.228780 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076fdc73-b211-49f2-89d6-ee6fff802f74-combined-ca-bundle\") pod \"076fdc73-b211-49f2-89d6-ee6fff802f74\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.228841 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/076fdc73-b211-49f2-89d6-ee6fff802f74-logs\") pod \"076fdc73-b211-49f2-89d6-ee6fff802f74\" (UID: \"076fdc73-b211-49f2-89d6-ee6fff802f74\") " Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.229304 4891 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/076fdc73-b211-49f2-89d6-ee6fff802f74-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.229849 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/076fdc73-b211-49f2-89d6-ee6fff802f74-logs" (OuterVolumeSpecName: "logs") pod "076fdc73-b211-49f2-89d6-ee6fff802f74" (UID: "076fdc73-b211-49f2-89d6-ee6fff802f74"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.234347 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/076fdc73-b211-49f2-89d6-ee6fff802f74-kube-api-access-kmm6l" (OuterVolumeSpecName: "kube-api-access-kmm6l") pod "076fdc73-b211-49f2-89d6-ee6fff802f74" (UID: "076fdc73-b211-49f2-89d6-ee6fff802f74"). InnerVolumeSpecName "kube-api-access-kmm6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.251782 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeaee9da-afe7-4854-95a3-ac91aeac850e-kube-api-access-tt77m" (OuterVolumeSpecName: "kube-api-access-tt77m") pod "eeaee9da-afe7-4854-95a3-ac91aeac850e" (UID: "eeaee9da-afe7-4854-95a3-ac91aeac850e"). InnerVolumeSpecName "kube-api-access-tt77m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.259635 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "076fdc73-b211-49f2-89d6-ee6fff802f74" (UID: "076fdc73-b211-49f2-89d6-ee6fff802f74"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.263938 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076fdc73-b211-49f2-89d6-ee6fff802f74-scripts" (OuterVolumeSpecName: "scripts") pod "076fdc73-b211-49f2-89d6-ee6fff802f74" (UID: "076fdc73-b211-49f2-89d6-ee6fff802f74"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.303281 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076fdc73-b211-49f2-89d6-ee6fff802f74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "076fdc73-b211-49f2-89d6-ee6fff802f74" (UID: "076fdc73-b211-49f2-89d6-ee6fff802f74"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.331736 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eeaee9da-afe7-4854-95a3-ac91aeac850e" (UID: "eeaee9da-afe7-4854-95a3-ac91aeac850e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.332247 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmm6l\" (UniqueName: \"kubernetes.io/projected/076fdc73-b211-49f2-89d6-ee6fff802f74-kube-api-access-kmm6l\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.332269 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/076fdc73-b211-49f2-89d6-ee6fff802f74-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.332292 4891 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.332302 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" 
Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.332311 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076fdc73-b211-49f2-89d6-ee6fff802f74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.332321 4891 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/076fdc73-b211-49f2-89d6-ee6fff802f74-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.332330 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt77m\" (UniqueName: \"kubernetes.io/projected/eeaee9da-afe7-4854-95a3-ac91aeac850e-kube-api-access-tt77m\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.355129 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eeaee9da-afe7-4854-95a3-ac91aeac850e" (UID: "eeaee9da-afe7-4854-95a3-ac91aeac850e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.362154 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "eeaee9da-afe7-4854-95a3-ac91aeac850e" (UID: "eeaee9da-afe7-4854-95a3-ac91aeac850e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.372961 4891 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.393974 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076fdc73-b211-49f2-89d6-ee6fff802f74-config-data" (OuterVolumeSpecName: "config-data") pod "076fdc73-b211-49f2-89d6-ee6fff802f74" (UID: "076fdc73-b211-49f2-89d6-ee6fff802f74"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.399658 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-config" (OuterVolumeSpecName: "config") pod "eeaee9da-afe7-4854-95a3-ac91aeac850e" (UID: "eeaee9da-afe7-4854-95a3-ac91aeac850e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.418979 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076fdc73-b211-49f2-89d6-ee6fff802f74-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "076fdc73-b211-49f2-89d6-ee6fff802f74" (UID: "076fdc73-b211-49f2-89d6-ee6fff802f74"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.421134 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "eeaee9da-afe7-4854-95a3-ac91aeac850e" (UID: "eeaee9da-afe7-4854-95a3-ac91aeac850e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.437331 4891 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/076fdc73-b211-49f2-89d6-ee6fff802f74-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.437367 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.437376 4891 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.437387 4891 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.437422 4891 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.437434 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/076fdc73-b211-49f2-89d6-ee6fff802f74-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.437444 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eeaee9da-afe7-4854-95a3-ac91aeac850e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.446719 4891 generic.go:334] "Generic (PLEG): container 
finished" podID="eeaee9da-afe7-4854-95a3-ac91aeac850e" containerID="936cd5a0ee9dfc94920ca83369700abd1c8da7b79e8649b713110f41cd242f00" exitCode=0 Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.447005 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.447209 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a6e7444d-97cc-440f-92de-e9db5ff440b5","Type":"ContainerStarted","Data":"54c8eefab7c0b5da69af24f51e5af44807d03b3aa93b16608a9c9b99c8a5dd45"} Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.447279 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" event={"ID":"eeaee9da-afe7-4854-95a3-ac91aeac850e","Type":"ContainerDied","Data":"936cd5a0ee9dfc94920ca83369700abd1c8da7b79e8649b713110f41cd242f00"} Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.447296 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-hwd4m" event={"ID":"eeaee9da-afe7-4854-95a3-ac91aeac850e","Type":"ContainerDied","Data":"de826fc7ee84a301aa4432579ff102902b1c852b020a29cbaaf175771cc60706"} Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.447312 4891 scope.go:117] "RemoveContainer" containerID="936cd5a0ee9dfc94920ca83369700abd1c8da7b79e8649b713110f41cd242f00" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.453023 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="803e2d14-8309-4637-9f99-d0903e7ac08e" containerName="cinder-scheduler" containerID="cri-o://63e2a990c20ee68d053fc2c1885fb168abc6d7c01bd5a9963201d6591925f826" gracePeriod=30 Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.453258 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.453447 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"076fdc73-b211-49f2-89d6-ee6fff802f74","Type":"ContainerDied","Data":"186f06960a826dcb07dcf17adc02b3c1ac8d2c13ec642e0fb352ef4a0982656c"} Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.453577 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="803e2d14-8309-4637-9f99-d0903e7ac08e" containerName="probe" containerID="cri-o://bec3be5d6daa93371faef7bcfba0a7302c998e9bf2c0f4eccea028d31863d8dd" gracePeriod=30 Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.505459 4891 scope.go:117] "RemoveContainer" containerID="9b60197635c0a43a6a5db94c4bef35282804d122359d9a2ab3cfcb9b62f122a9" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.517538 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-hwd4m"] Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.528860 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-hwd4m"] Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.543034 4891 scope.go:117] "RemoveContainer" containerID="936cd5a0ee9dfc94920ca83369700abd1c8da7b79e8649b713110f41cd242f00" Sep 29 10:06:49 crc kubenswrapper[4891]: E0929 10:06:49.545059 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"936cd5a0ee9dfc94920ca83369700abd1c8da7b79e8649b713110f41cd242f00\": container with ID starting with 936cd5a0ee9dfc94920ca83369700abd1c8da7b79e8649b713110f41cd242f00 not found: ID does not exist" containerID="936cd5a0ee9dfc94920ca83369700abd1c8da7b79e8649b713110f41cd242f00" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.545100 4891 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"936cd5a0ee9dfc94920ca83369700abd1c8da7b79e8649b713110f41cd242f00"} err="failed to get container status \"936cd5a0ee9dfc94920ca83369700abd1c8da7b79e8649b713110f41cd242f00\": rpc error: code = NotFound desc = could not find container \"936cd5a0ee9dfc94920ca83369700abd1c8da7b79e8649b713110f41cd242f00\": container with ID starting with 936cd5a0ee9dfc94920ca83369700abd1c8da7b79e8649b713110f41cd242f00 not found: ID does not exist" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.545130 4891 scope.go:117] "RemoveContainer" containerID="9b60197635c0a43a6a5db94c4bef35282804d122359d9a2ab3cfcb9b62f122a9" Sep 29 10:06:49 crc kubenswrapper[4891]: E0929 10:06:49.546411 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b60197635c0a43a6a5db94c4bef35282804d122359d9a2ab3cfcb9b62f122a9\": container with ID starting with 9b60197635c0a43a6a5db94c4bef35282804d122359d9a2ab3cfcb9b62f122a9 not found: ID does not exist" containerID="9b60197635c0a43a6a5db94c4bef35282804d122359d9a2ab3cfcb9b62f122a9" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.546439 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b60197635c0a43a6a5db94c4bef35282804d122359d9a2ab3cfcb9b62f122a9"} err="failed to get container status \"9b60197635c0a43a6a5db94c4bef35282804d122359d9a2ab3cfcb9b62f122a9\": rpc error: code = NotFound desc = could not find container \"9b60197635c0a43a6a5db94c4bef35282804d122359d9a2ab3cfcb9b62f122a9\": container with ID starting with 9b60197635c0a43a6a5db94c4bef35282804d122359d9a2ab3cfcb9b62f122a9 not found: ID does not exist" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.546460 4891 scope.go:117] "RemoveContainer" containerID="c96f65321403a5f5053ff8ebc61e35675865e2e4730ce75835100b436eb5a932" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.548938 4891 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:06:49 crc kubenswrapper[4891]: W0929 10:06:49.562475 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5a3896b_fa69_42c2_bf4f_fa7139fb0c31.slice/crio-d9fbf0d3d21573fc92017f2f3bb6ac688895acb2fb0c3e9cdfd9fb314c993a0a WatchSource:0}: Error finding container d9fbf0d3d21573fc92017f2f3bb6ac688895acb2fb0c3e9cdfd9fb314c993a0a: Status 404 returned error can't find the container with id d9fbf0d3d21573fc92017f2f3bb6ac688895acb2fb0c3e9cdfd9fb314c993a0a Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.565082 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.579100 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.588991 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:06:49 crc kubenswrapper[4891]: E0929 10:06:49.589576 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeaee9da-afe7-4854-95a3-ac91aeac850e" containerName="dnsmasq-dns" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.589597 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeaee9da-afe7-4854-95a3-ac91aeac850e" containerName="dnsmasq-dns" Sep 29 10:06:49 crc kubenswrapper[4891]: E0929 10:06:49.589625 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="076fdc73-b211-49f2-89d6-ee6fff802f74" containerName="glance-log" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.589634 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="076fdc73-b211-49f2-89d6-ee6fff802f74" containerName="glance-log" Sep 29 10:06:49 crc kubenswrapper[4891]: E0929 10:06:49.589655 4891 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="076fdc73-b211-49f2-89d6-ee6fff802f74" containerName="glance-httpd" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.589663 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="076fdc73-b211-49f2-89d6-ee6fff802f74" containerName="glance-httpd" Sep 29 10:06:49 crc kubenswrapper[4891]: E0929 10:06:49.589679 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeaee9da-afe7-4854-95a3-ac91aeac850e" containerName="init" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.589687 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeaee9da-afe7-4854-95a3-ac91aeac850e" containerName="init" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.589778 4891 scope.go:117] "RemoveContainer" containerID="dc9d8171bcdde9cf808937e4987bab002f8d57bf3f88268920642ae422da90c6" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.590005 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="076fdc73-b211-49f2-89d6-ee6fff802f74" containerName="glance-httpd" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.590048 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeaee9da-afe7-4854-95a3-ac91aeac850e" containerName="dnsmasq-dns" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.590060 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="076fdc73-b211-49f2-89d6-ee6fff802f74" containerName="glance-log" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.591659 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.595386 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.595507 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.596766 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.748691 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ca02e873-8e2c-4958-a757-92efa57fdea8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") " pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.748769 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") " pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.748843 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca02e873-8e2c-4958-a757-92efa57fdea8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") " pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.748875 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ca02e873-8e2c-4958-a757-92efa57fdea8-config-data\") pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") " pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.748957 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca02e873-8e2c-4958-a757-92efa57fdea8-logs\") pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") " pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.748986 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca02e873-8e2c-4958-a757-92efa57fdea8-scripts\") pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") " pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.749032 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p54gb\" (UniqueName: \"kubernetes.io/projected/ca02e873-8e2c-4958-a757-92efa57fdea8-kube-api-access-p54gb\") pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") " pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.749082 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca02e873-8e2c-4958-a757-92efa57fdea8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") " pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.851212 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ca02e873-8e2c-4958-a757-92efa57fdea8-config-data\") pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") " pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.851365 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca02e873-8e2c-4958-a757-92efa57fdea8-logs\") pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") " pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.851404 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca02e873-8e2c-4958-a757-92efa57fdea8-scripts\") pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") " pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.851450 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p54gb\" (UniqueName: \"kubernetes.io/projected/ca02e873-8e2c-4958-a757-92efa57fdea8-kube-api-access-p54gb\") pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") " pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.851494 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca02e873-8e2c-4958-a757-92efa57fdea8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") " pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.851553 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ca02e873-8e2c-4958-a757-92efa57fdea8-httpd-run\") 
pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") " pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.851597 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") " pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.851634 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca02e873-8e2c-4958-a757-92efa57fdea8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") " pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.852387 4891 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.852904 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ca02e873-8e2c-4958-a757-92efa57fdea8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") " pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.853005 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca02e873-8e2c-4958-a757-92efa57fdea8-logs\") pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") " 
pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.858877 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca02e873-8e2c-4958-a757-92efa57fdea8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") " pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.860395 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca02e873-8e2c-4958-a757-92efa57fdea8-config-data\") pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") " pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.863100 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca02e873-8e2c-4958-a757-92efa57fdea8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") " pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.867194 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca02e873-8e2c-4958-a757-92efa57fdea8-scripts\") pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") " pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: I0929 10:06:49.872985 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p54gb\" (UniqueName: \"kubernetes.io/projected/ca02e873-8e2c-4958-a757-92efa57fdea8-kube-api-access-p54gb\") pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") " pod="openstack/glance-default-external-api-0" Sep 29 10:06:49 crc kubenswrapper[4891]: 
I0929 10:06:49.936746 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ca02e873-8e2c-4958-a757-92efa57fdea8\") " pod="openstack/glance-default-external-api-0" Sep 29 10:06:50 crc kubenswrapper[4891]: I0929 10:06:50.228424 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 10:06:50 crc kubenswrapper[4891]: I0929 10:06:50.418553 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="076fdc73-b211-49f2-89d6-ee6fff802f74" path="/var/lib/kubelet/pods/076fdc73-b211-49f2-89d6-ee6fff802f74/volumes" Sep 29 10:06:50 crc kubenswrapper[4891]: I0929 10:06:50.419657 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eeaee9da-afe7-4854-95a3-ac91aeac850e" path="/var/lib/kubelet/pods/eeaee9da-afe7-4854-95a3-ac91aeac850e/volumes" Sep 29 10:06:50 crc kubenswrapper[4891]: I0929 10:06:50.420544 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb3d540e-66d9-4d98-8cf2-a75d809d76f8" path="/var/lib/kubelet/pods/fb3d540e-66d9-4d98-8cf2-a75d809d76f8/volumes" Sep 29 10:06:50 crc kubenswrapper[4891]: I0929 10:06:50.489104 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31","Type":"ContainerStarted","Data":"d9fbf0d3d21573fc92017f2f3bb6ac688895acb2fb0c3e9cdfd9fb314c993a0a"} Sep 29 10:06:50 crc kubenswrapper[4891]: I0929 10:06:50.503653 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a6e7444d-97cc-440f-92de-e9db5ff440b5","Type":"ContainerStarted","Data":"b0012c2512fe116acc9a88ff3568c4bfd605844d7870664147c6d7645454c0ca"} Sep 29 10:06:50 crc kubenswrapper[4891]: I0929 10:06:50.503698 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"a6e7444d-97cc-440f-92de-e9db5ff440b5","Type":"ContainerStarted","Data":"6bc01a974e745298da743a8b675f586b69fe914578b58ad8dc604b89e6615d8e"} Sep 29 10:06:50 crc kubenswrapper[4891]: I0929 10:06:50.876403 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.517740 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ca02e873-8e2c-4958-a757-92efa57fdea8","Type":"ContainerStarted","Data":"d04eb3d1cd7e583deacfcbd95d1ffaab96eaf68c1b26ea7ce2e749e04fc5b503"} Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.519887 4891 generic.go:334] "Generic (PLEG): container finished" podID="803e2d14-8309-4637-9f99-d0903e7ac08e" containerID="bec3be5d6daa93371faef7bcfba0a7302c998e9bf2c0f4eccea028d31863d8dd" exitCode=0 Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.519926 4891 generic.go:334] "Generic (PLEG): container finished" podID="803e2d14-8309-4637-9f99-d0903e7ac08e" containerID="63e2a990c20ee68d053fc2c1885fb168abc6d7c01bd5a9963201d6591925f826" exitCode=0 Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.520319 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"803e2d14-8309-4637-9f99-d0903e7ac08e","Type":"ContainerDied","Data":"bec3be5d6daa93371faef7bcfba0a7302c998e9bf2c0f4eccea028d31863d8dd"} Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.520364 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"803e2d14-8309-4637-9f99-d0903e7ac08e","Type":"ContainerDied","Data":"63e2a990c20ee68d053fc2c1885fb168abc6d7c01bd5a9963201d6591925f826"} Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.520385 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"803e2d14-8309-4637-9f99-d0903e7ac08e","Type":"ContainerDied","Data":"c0e0e572db8cdea63ab3506a60984d7f7c4d88252a61bf9fc2674c84c4171f0e"} Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.520396 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0e0e572db8cdea63ab3506a60984d7f7c4d88252a61bf9fc2674c84c4171f0e" Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.548467 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.548447816 podStartE2EDuration="4.548447816s" podCreationTimestamp="2025-09-29 10:06:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:06:51.543404527 +0000 UTC m=+1141.748572858" watchObservedRunningTime="2025-09-29 10:06:51.548447816 +0000 UTC m=+1141.753616127" Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.570692 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.708537 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/803e2d14-8309-4637-9f99-d0903e7ac08e-config-data\") pod \"803e2d14-8309-4637-9f99-d0903e7ac08e\" (UID: \"803e2d14-8309-4637-9f99-d0903e7ac08e\") " Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.708640 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tfdr\" (UniqueName: \"kubernetes.io/projected/803e2d14-8309-4637-9f99-d0903e7ac08e-kube-api-access-4tfdr\") pod \"803e2d14-8309-4637-9f99-d0903e7ac08e\" (UID: \"803e2d14-8309-4637-9f99-d0903e7ac08e\") " Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.708714 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/803e2d14-8309-4637-9f99-d0903e7ac08e-config-data-custom\") pod \"803e2d14-8309-4637-9f99-d0903e7ac08e\" (UID: \"803e2d14-8309-4637-9f99-d0903e7ac08e\") " Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.708740 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/803e2d14-8309-4637-9f99-d0903e7ac08e-scripts\") pod \"803e2d14-8309-4637-9f99-d0903e7ac08e\" (UID: \"803e2d14-8309-4637-9f99-d0903e7ac08e\") " Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.708757 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/803e2d14-8309-4637-9f99-d0903e7ac08e-etc-machine-id\") pod \"803e2d14-8309-4637-9f99-d0903e7ac08e\" (UID: \"803e2d14-8309-4637-9f99-d0903e7ac08e\") " Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.708863 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/803e2d14-8309-4637-9f99-d0903e7ac08e-combined-ca-bundle\") pod \"803e2d14-8309-4637-9f99-d0903e7ac08e\" (UID: \"803e2d14-8309-4637-9f99-d0903e7ac08e\") " Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.709031 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/803e2d14-8309-4637-9f99-d0903e7ac08e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "803e2d14-8309-4637-9f99-d0903e7ac08e" (UID: "803e2d14-8309-4637-9f99-d0903e7ac08e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.709437 4891 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/803e2d14-8309-4637-9f99-d0903e7ac08e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.717953 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803e2d14-8309-4637-9f99-d0903e7ac08e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "803e2d14-8309-4637-9f99-d0903e7ac08e" (UID: "803e2d14-8309-4637-9f99-d0903e7ac08e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.718027 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803e2d14-8309-4637-9f99-d0903e7ac08e-scripts" (OuterVolumeSpecName: "scripts") pod "803e2d14-8309-4637-9f99-d0903e7ac08e" (UID: "803e2d14-8309-4637-9f99-d0903e7ac08e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.718194 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/803e2d14-8309-4637-9f99-d0903e7ac08e-kube-api-access-4tfdr" (OuterVolumeSpecName: "kube-api-access-4tfdr") pod "803e2d14-8309-4637-9f99-d0903e7ac08e" (UID: "803e2d14-8309-4637-9f99-d0903e7ac08e"). InnerVolumeSpecName "kube-api-access-4tfdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.779619 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803e2d14-8309-4637-9f99-d0903e7ac08e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "803e2d14-8309-4637-9f99-d0903e7ac08e" (UID: "803e2d14-8309-4637-9f99-d0903e7ac08e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.812021 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tfdr\" (UniqueName: \"kubernetes.io/projected/803e2d14-8309-4637-9f99-d0903e7ac08e-kube-api-access-4tfdr\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.812067 4891 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/803e2d14-8309-4637-9f99-d0903e7ac08e-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.812082 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/803e2d14-8309-4637-9f99-d0903e7ac08e-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.812096 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/803e2d14-8309-4637-9f99-d0903e7ac08e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.831139 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803e2d14-8309-4637-9f99-d0903e7ac08e-config-data" (OuterVolumeSpecName: "config-data") pod "803e2d14-8309-4637-9f99-d0903e7ac08e" (UID: "803e2d14-8309-4637-9f99-d0903e7ac08e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:51 crc kubenswrapper[4891]: I0929 10:06:51.914166 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/803e2d14-8309-4637-9f99-d0903e7ac08e-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.542259 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31","Type":"ContainerStarted","Data":"ae592288fee06d9462ebe2ea7036637b6e893a3439821c26330cc1274241da3b"} Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.545257 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.545241 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ca02e873-8e2c-4958-a757-92efa57fdea8","Type":"ContainerStarted","Data":"c7aef070d8494ff3d903478088a9659d61fdc05ccddaed83e2c35d2fd0fbeb7e"} Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.581330 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.590341 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.602294 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 29 10:06:52 crc kubenswrapper[4891]: E0929 10:06:52.602769 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="803e2d14-8309-4637-9f99-d0903e7ac08e" containerName="cinder-scheduler" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.602798 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="803e2d14-8309-4637-9f99-d0903e7ac08e" containerName="cinder-scheduler" Sep 29 10:06:52 crc kubenswrapper[4891]: E0929 10:06:52.602822 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="803e2d14-8309-4637-9f99-d0903e7ac08e" containerName="probe" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.602830 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="803e2d14-8309-4637-9f99-d0903e7ac08e" containerName="probe" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.603040 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="803e2d14-8309-4637-9f99-d0903e7ac08e" containerName="cinder-scheduler" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.603060 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="803e2d14-8309-4637-9f99-d0903e7ac08e" 
containerName="probe" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.604044 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.606370 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.613101 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.742562 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/67d0b646-e147-42d3-8ef9-9001b2b24313-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"67d0b646-e147-42d3-8ef9-9001b2b24313\") " pod="openstack/cinder-scheduler-0" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.742625 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/67d0b646-e147-42d3-8ef9-9001b2b24313-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"67d0b646-e147-42d3-8ef9-9001b2b24313\") " pod="openstack/cinder-scheduler-0" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.742668 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67d0b646-e147-42d3-8ef9-9001b2b24313-scripts\") pod \"cinder-scheduler-0\" (UID: \"67d0b646-e147-42d3-8ef9-9001b2b24313\") " pod="openstack/cinder-scheduler-0" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.742726 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d0b646-e147-42d3-8ef9-9001b2b24313-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"67d0b646-e147-42d3-8ef9-9001b2b24313\") " pod="openstack/cinder-scheduler-0" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.742781 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d0b646-e147-42d3-8ef9-9001b2b24313-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"67d0b646-e147-42d3-8ef9-9001b2b24313\") " pod="openstack/cinder-scheduler-0" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.742861 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtps2\" (UniqueName: \"kubernetes.io/projected/67d0b646-e147-42d3-8ef9-9001b2b24313-kube-api-access-mtps2\") pod \"cinder-scheduler-0\" (UID: \"67d0b646-e147-42d3-8ef9-9001b2b24313\") " pod="openstack/cinder-scheduler-0" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.845235 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d0b646-e147-42d3-8ef9-9001b2b24313-config-data\") pod \"cinder-scheduler-0\" (UID: \"67d0b646-e147-42d3-8ef9-9001b2b24313\") " pod="openstack/cinder-scheduler-0" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.845283 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d0b646-e147-42d3-8ef9-9001b2b24313-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"67d0b646-e147-42d3-8ef9-9001b2b24313\") " pod="openstack/cinder-scheduler-0" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.845342 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtps2\" (UniqueName: \"kubernetes.io/projected/67d0b646-e147-42d3-8ef9-9001b2b24313-kube-api-access-mtps2\") pod \"cinder-scheduler-0\" (UID: \"67d0b646-e147-42d3-8ef9-9001b2b24313\") " pod="openstack/cinder-scheduler-0" Sep 29 
10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.845375 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/67d0b646-e147-42d3-8ef9-9001b2b24313-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"67d0b646-e147-42d3-8ef9-9001b2b24313\") " pod="openstack/cinder-scheduler-0" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.845405 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/67d0b646-e147-42d3-8ef9-9001b2b24313-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"67d0b646-e147-42d3-8ef9-9001b2b24313\") " pod="openstack/cinder-scheduler-0" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.845441 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67d0b646-e147-42d3-8ef9-9001b2b24313-scripts\") pod \"cinder-scheduler-0\" (UID: \"67d0b646-e147-42d3-8ef9-9001b2b24313\") " pod="openstack/cinder-scheduler-0" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.851273 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/67d0b646-e147-42d3-8ef9-9001b2b24313-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"67d0b646-e147-42d3-8ef9-9001b2b24313\") " pod="openstack/cinder-scheduler-0" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.852124 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67d0b646-e147-42d3-8ef9-9001b2b24313-scripts\") pod \"cinder-scheduler-0\" (UID: \"67d0b646-e147-42d3-8ef9-9001b2b24313\") " pod="openstack/cinder-scheduler-0" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.852712 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/67d0b646-e147-42d3-8ef9-9001b2b24313-config-data\") pod \"cinder-scheduler-0\" (UID: \"67d0b646-e147-42d3-8ef9-9001b2b24313\") " pod="openstack/cinder-scheduler-0" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.859136 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d0b646-e147-42d3-8ef9-9001b2b24313-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"67d0b646-e147-42d3-8ef9-9001b2b24313\") " pod="openstack/cinder-scheduler-0" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.860554 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/67d0b646-e147-42d3-8ef9-9001b2b24313-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"67d0b646-e147-42d3-8ef9-9001b2b24313\") " pod="openstack/cinder-scheduler-0" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.869956 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtps2\" (UniqueName: \"kubernetes.io/projected/67d0b646-e147-42d3-8ef9-9001b2b24313-kube-api-access-mtps2\") pod \"cinder-scheduler-0\" (UID: \"67d0b646-e147-42d3-8ef9-9001b2b24313\") " pod="openstack/cinder-scheduler-0" Sep 29 10:06:52 crc kubenswrapper[4891]: I0929 10:06:52.923970 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 29 10:06:53 crc kubenswrapper[4891]: I0929 10:06:53.501467 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 29 10:06:53 crc kubenswrapper[4891]: I0929 10:06:53.555922 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31","Type":"ContainerStarted","Data":"5363bc3628edec075ea6023d3f89fdfb511a4993064bde2b10f9c3b24d9013f1"} Sep 29 10:06:53 crc kubenswrapper[4891]: I0929 10:06:53.557574 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ca02e873-8e2c-4958-a757-92efa57fdea8","Type":"ContainerStarted","Data":"6a4d7638d1eddd6d9dba39e2d426b0eeb15f13f970979b1c282727767290f3f2"} Sep 29 10:06:53 crc kubenswrapper[4891]: I0929 10:06:53.560152 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"67d0b646-e147-42d3-8ef9-9001b2b24313","Type":"ContainerStarted","Data":"7ab2a57b86392949e3e91022853d7c68dfe67639ed381cfb16c61f9c91b2dc54"} Sep 29 10:06:53 crc kubenswrapper[4891]: I0929 10:06:53.580728 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.580710466 podStartE2EDuration="4.580710466s" podCreationTimestamp="2025-09-29 10:06:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:06:53.574586625 +0000 UTC m=+1143.779754956" watchObservedRunningTime="2025-09-29 10:06:53.580710466 +0000 UTC m=+1143.785878787" Sep 29 10:06:54 crc kubenswrapper[4891]: I0929 10:06:54.419643 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="803e2d14-8309-4637-9f99-d0903e7ac08e" path="/var/lib/kubelet/pods/803e2d14-8309-4637-9f99-d0903e7ac08e/volumes" Sep 29 10:06:54 crc kubenswrapper[4891]: 
I0929 10:06:54.421373 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jt2hn"] Sep 29 10:06:54 crc kubenswrapper[4891]: I0929 10:06:54.422611 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jt2hn" Sep 29 10:06:54 crc kubenswrapper[4891]: I0929 10:06:54.427395 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 29 10:06:54 crc kubenswrapper[4891]: I0929 10:06:54.430619 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jt2hn"] Sep 29 10:06:54 crc kubenswrapper[4891]: I0929 10:06:54.450075 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Sep 29 10:06:54 crc kubenswrapper[4891]: I0929 10:06:54.450751 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-ps2k7" Sep 29 10:06:54 crc kubenswrapper[4891]: I0929 10:06:54.496487 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f8acd86-ee16-42c5-9309-7651699a0886-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jt2hn\" (UID: \"2f8acd86-ee16-42c5-9309-7651699a0886\") " pod="openstack/nova-cell0-conductor-db-sync-jt2hn" Sep 29 10:06:54 crc kubenswrapper[4891]: I0929 10:06:54.496583 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcr5n\" (UniqueName: \"kubernetes.io/projected/2f8acd86-ee16-42c5-9309-7651699a0886-kube-api-access-fcr5n\") pod \"nova-cell0-conductor-db-sync-jt2hn\" (UID: \"2f8acd86-ee16-42c5-9309-7651699a0886\") " pod="openstack/nova-cell0-conductor-db-sync-jt2hn" Sep 29 10:06:54 crc kubenswrapper[4891]: I0929 10:06:54.496635 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f8acd86-ee16-42c5-9309-7651699a0886-scripts\") pod \"nova-cell0-conductor-db-sync-jt2hn\" (UID: \"2f8acd86-ee16-42c5-9309-7651699a0886\") " pod="openstack/nova-cell0-conductor-db-sync-jt2hn" Sep 29 10:06:54 crc kubenswrapper[4891]: I0929 10:06:54.496709 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f8acd86-ee16-42c5-9309-7651699a0886-config-data\") pod \"nova-cell0-conductor-db-sync-jt2hn\" (UID: \"2f8acd86-ee16-42c5-9309-7651699a0886\") " pod="openstack/nova-cell0-conductor-db-sync-jt2hn" Sep 29 10:06:54 crc kubenswrapper[4891]: I0929 10:06:54.575464 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"67d0b646-e147-42d3-8ef9-9001b2b24313","Type":"ContainerStarted","Data":"8fda7326f1cfd4dcb0332b83efb34711e5588a5f46547bca6d6a77da97c1831c"} Sep 29 10:06:54 crc kubenswrapper[4891]: I0929 10:06:54.577428 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31","Type":"ContainerStarted","Data":"cd7228a0a9e2f1de392c2bb43fd16881e4089be661cb1c959772224b90b20a04"} Sep 29 10:06:54 crc kubenswrapper[4891]: I0929 10:06:54.602920 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f8acd86-ee16-42c5-9309-7651699a0886-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jt2hn\" (UID: \"2f8acd86-ee16-42c5-9309-7651699a0886\") " pod="openstack/nova-cell0-conductor-db-sync-jt2hn" Sep 29 10:06:54 crc kubenswrapper[4891]: I0929 10:06:54.603115 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcr5n\" (UniqueName: 
\"kubernetes.io/projected/2f8acd86-ee16-42c5-9309-7651699a0886-kube-api-access-fcr5n\") pod \"nova-cell0-conductor-db-sync-jt2hn\" (UID: \"2f8acd86-ee16-42c5-9309-7651699a0886\") " pod="openstack/nova-cell0-conductor-db-sync-jt2hn" Sep 29 10:06:54 crc kubenswrapper[4891]: I0929 10:06:54.603196 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f8acd86-ee16-42c5-9309-7651699a0886-scripts\") pod \"nova-cell0-conductor-db-sync-jt2hn\" (UID: \"2f8acd86-ee16-42c5-9309-7651699a0886\") " pod="openstack/nova-cell0-conductor-db-sync-jt2hn" Sep 29 10:06:54 crc kubenswrapper[4891]: I0929 10:06:54.603430 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f8acd86-ee16-42c5-9309-7651699a0886-config-data\") pod \"nova-cell0-conductor-db-sync-jt2hn\" (UID: \"2f8acd86-ee16-42c5-9309-7651699a0886\") " pod="openstack/nova-cell0-conductor-db-sync-jt2hn" Sep 29 10:06:54 crc kubenswrapper[4891]: I0929 10:06:54.610198 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f8acd86-ee16-42c5-9309-7651699a0886-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jt2hn\" (UID: \"2f8acd86-ee16-42c5-9309-7651699a0886\") " pod="openstack/nova-cell0-conductor-db-sync-jt2hn" Sep 29 10:06:54 crc kubenswrapper[4891]: I0929 10:06:54.610624 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f8acd86-ee16-42c5-9309-7651699a0886-scripts\") pod \"nova-cell0-conductor-db-sync-jt2hn\" (UID: \"2f8acd86-ee16-42c5-9309-7651699a0886\") " pod="openstack/nova-cell0-conductor-db-sync-jt2hn" Sep 29 10:06:54 crc kubenswrapper[4891]: I0929 10:06:54.626516 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcr5n\" (UniqueName: 
\"kubernetes.io/projected/2f8acd86-ee16-42c5-9309-7651699a0886-kube-api-access-fcr5n\") pod \"nova-cell0-conductor-db-sync-jt2hn\" (UID: \"2f8acd86-ee16-42c5-9309-7651699a0886\") " pod="openstack/nova-cell0-conductor-db-sync-jt2hn" Sep 29 10:06:54 crc kubenswrapper[4891]: I0929 10:06:54.627240 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f8acd86-ee16-42c5-9309-7651699a0886-config-data\") pod \"nova-cell0-conductor-db-sync-jt2hn\" (UID: \"2f8acd86-ee16-42c5-9309-7651699a0886\") " pod="openstack/nova-cell0-conductor-db-sync-jt2hn" Sep 29 10:06:54 crc kubenswrapper[4891]: I0929 10:06:54.916054 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jt2hn" Sep 29 10:06:55 crc kubenswrapper[4891]: I0929 10:06:55.417014 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jt2hn"] Sep 29 10:06:55 crc kubenswrapper[4891]: I0929 10:06:55.603247 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jt2hn" event={"ID":"2f8acd86-ee16-42c5-9309-7651699a0886","Type":"ContainerStarted","Data":"5bad63b465fb9b6045ae9603e7e816428de3aecdf941fd988b8507a1c582601a"} Sep 29 10:06:55 crc kubenswrapper[4891]: I0929 10:06:55.606963 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31","Type":"ContainerStarted","Data":"f23dd9c069fb51194d77c4a34e59a6bbbea406030949298a048cda1cf7fb4c3d"} Sep 29 10:06:55 crc kubenswrapper[4891]: I0929 10:06:55.607104 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 29 10:06:55 crc kubenswrapper[4891]: I0929 10:06:55.611922 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"67d0b646-e147-42d3-8ef9-9001b2b24313","Type":"ContainerStarted","Data":"412f4fef9288f0c751353f5a1470441e452ecaa0ea9e030622ffc27c0d999731"} Sep 29 10:06:55 crc kubenswrapper[4891]: I0929 10:06:55.632902 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.12033336 podStartE2EDuration="7.632861884s" podCreationTimestamp="2025-09-29 10:06:48 +0000 UTC" firstStartedPulling="2025-09-29 10:06:49.590056691 +0000 UTC m=+1139.795225022" lastFinishedPulling="2025-09-29 10:06:55.102585225 +0000 UTC m=+1145.307753546" observedRunningTime="2025-09-29 10:06:55.625510026 +0000 UTC m=+1145.830678367" watchObservedRunningTime="2025-09-29 10:06:55.632861884 +0000 UTC m=+1145.838030215" Sep 29 10:06:55 crc kubenswrapper[4891]: I0929 10:06:55.643643 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.643618642 podStartE2EDuration="3.643618642s" podCreationTimestamp="2025-09-29 10:06:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:06:55.643257421 +0000 UTC m=+1145.848425752" watchObservedRunningTime="2025-09-29 10:06:55.643618642 +0000 UTC m=+1145.848786953" Sep 29 10:06:56 crc kubenswrapper[4891]: I0929 10:06:56.278465 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:06:56 crc kubenswrapper[4891]: I0929 10:06:56.296439 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Sep 29 10:06:57 crc kubenswrapper[4891]: I0929 10:06:57.631101 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" containerName="ceilometer-central-agent" containerID="cri-o://ae592288fee06d9462ebe2ea7036637b6e893a3439821c26330cc1274241da3b" gracePeriod=30 Sep 29 
10:06:57 crc kubenswrapper[4891]: I0929 10:06:57.633063 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" containerName="proxy-httpd" containerID="cri-o://f23dd9c069fb51194d77c4a34e59a6bbbea406030949298a048cda1cf7fb4c3d" gracePeriod=30 Sep 29 10:06:57 crc kubenswrapper[4891]: I0929 10:06:57.633140 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" containerName="sg-core" containerID="cri-o://cd7228a0a9e2f1de392c2bb43fd16881e4089be661cb1c959772224b90b20a04" gracePeriod=30 Sep 29 10:06:57 crc kubenswrapper[4891]: I0929 10:06:57.633222 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" containerName="ceilometer-notification-agent" containerID="cri-o://5363bc3628edec075ea6023d3f89fdfb511a4993064bde2b10f9c3b24d9013f1" gracePeriod=30 Sep 29 10:06:57 crc kubenswrapper[4891]: I0929 10:06:57.924132 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 29 10:06:57 crc kubenswrapper[4891]: E0929 10:06:57.986460 4891 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5a3896b_fa69_42c2_bf4f_fa7139fb0c31.slice/crio-conmon-f23dd9c069fb51194d77c4a34e59a6bbbea406030949298a048cda1cf7fb4c3d.scope\": RecentStats: unable to find data in memory cache]" Sep 29 10:06:58 crc kubenswrapper[4891]: I0929 10:06:58.086298 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 29 10:06:58 crc kubenswrapper[4891]: I0929 10:06:58.086341 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-internal-api-0" Sep 29 10:06:58 crc kubenswrapper[4891]: I0929 10:06:58.131483 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 29 10:06:58 crc kubenswrapper[4891]: I0929 10:06:58.153571 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 29 10:06:58 crc kubenswrapper[4891]: I0929 10:06:58.646025 4891 generic.go:334] "Generic (PLEG): container finished" podID="e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" containerID="f23dd9c069fb51194d77c4a34e59a6bbbea406030949298a048cda1cf7fb4c3d" exitCode=0 Sep 29 10:06:58 crc kubenswrapper[4891]: I0929 10:06:58.646427 4891 generic.go:334] "Generic (PLEG): container finished" podID="e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" containerID="cd7228a0a9e2f1de392c2bb43fd16881e4089be661cb1c959772224b90b20a04" exitCode=2 Sep 29 10:06:58 crc kubenswrapper[4891]: I0929 10:06:58.646435 4891 generic.go:334] "Generic (PLEG): container finished" podID="e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" containerID="5363bc3628edec075ea6023d3f89fdfb511a4993064bde2b10f9c3b24d9013f1" exitCode=0 Sep 29 10:06:58 crc kubenswrapper[4891]: I0929 10:06:58.646088 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31","Type":"ContainerDied","Data":"f23dd9c069fb51194d77c4a34e59a6bbbea406030949298a048cda1cf7fb4c3d"} Sep 29 10:06:58 crc kubenswrapper[4891]: I0929 10:06:58.646521 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31","Type":"ContainerDied","Data":"cd7228a0a9e2f1de392c2bb43fd16881e4089be661cb1c959772224b90b20a04"} Sep 29 10:06:58 crc kubenswrapper[4891]: I0929 10:06:58.646535 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31","Type":"ContainerDied","Data":"5363bc3628edec075ea6023d3f89fdfb511a4993064bde2b10f9c3b24d9013f1"} Sep 29 10:06:58 crc kubenswrapper[4891]: I0929 10:06:58.646709 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 29 10:06:58 crc kubenswrapper[4891]: I0929 10:06:58.646744 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 29 10:07:00 crc kubenswrapper[4891]: I0929 10:07:00.229137 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 29 10:07:00 crc kubenswrapper[4891]: I0929 10:07:00.231418 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 29 10:07:00 crc kubenswrapper[4891]: I0929 10:07:00.276520 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 29 10:07:00 crc kubenswrapper[4891]: I0929 10:07:00.315343 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 29 10:07:00 crc kubenswrapper[4891]: I0929 10:07:00.678497 4891 generic.go:334] "Generic (PLEG): container finished" podID="e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" containerID="ae592288fee06d9462ebe2ea7036637b6e893a3439821c26330cc1274241da3b" exitCode=0 Sep 29 10:07:00 crc kubenswrapper[4891]: I0929 10:07:00.678605 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31","Type":"ContainerDied","Data":"ae592288fee06d9462ebe2ea7036637b6e893a3439821c26330cc1274241da3b"} Sep 29 10:07:00 crc kubenswrapper[4891]: I0929 10:07:00.679122 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 29 10:07:00 crc 
kubenswrapper[4891]: I0929 10:07:00.679563 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 29 10:07:01 crc kubenswrapper[4891]: I0929 10:07:01.048515 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 29 10:07:01 crc kubenswrapper[4891]: I0929 10:07:01.048901 4891 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 10:07:01 crc kubenswrapper[4891]: I0929 10:07:01.107003 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 29 10:07:02 crc kubenswrapper[4891]: I0929 10:07:02.705035 4891 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 10:07:02 crc kubenswrapper[4891]: I0929 10:07:02.706845 4891 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 10:07:02 crc kubenswrapper[4891]: I0929 10:07:02.880728 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 29 10:07:02 crc kubenswrapper[4891]: I0929 10:07:02.970052 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 29 10:07:03 crc kubenswrapper[4891]: I0929 10:07:03.180044 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.591017 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.687887 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-run-httpd\") pod \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.687937 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-td4b5\" (UniqueName: \"kubernetes.io/projected/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-kube-api-access-td4b5\") pod \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.688010 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-combined-ca-bundle\") pod \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.688051 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-log-httpd\") pod \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.688081 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-config-data\") pod \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.688144 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-sg-core-conf-yaml\") pod \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.688167 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-scripts\") pod \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\" (UID: \"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31\") " Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.689585 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" (UID: "e5a3896b-fa69-42c2-bf4f-fa7139fb0c31"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.693603 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-kube-api-access-td4b5" (OuterVolumeSpecName: "kube-api-access-td4b5") pod "e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" (UID: "e5a3896b-fa69-42c2-bf4f-fa7139fb0c31"). InnerVolumeSpecName "kube-api-access-td4b5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.695101 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-scripts" (OuterVolumeSpecName: "scripts") pod "e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" (UID: "e5a3896b-fa69-42c2-bf4f-fa7139fb0c31"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.695165 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" (UID: "e5a3896b-fa69-42c2-bf4f-fa7139fb0c31"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.722399 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" (UID: "e5a3896b-fa69-42c2-bf4f-fa7139fb0c31"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.743686 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jt2hn" event={"ID":"2f8acd86-ee16-42c5-9309-7651699a0886","Type":"ContainerStarted","Data":"2d7ac500e158e103dd51ba49af2b87c5f48717820109d2f4561edaa95d5f6c12"} Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.749028 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5a3896b-fa69-42c2-bf4f-fa7139fb0c31","Type":"ContainerDied","Data":"d9fbf0d3d21573fc92017f2f3bb6ac688895acb2fb0c3e9cdfd9fb314c993a0a"} Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.749243 4891 scope.go:117] "RemoveContainer" containerID="f23dd9c069fb51194d77c4a34e59a6bbbea406030949298a048cda1cf7fb4c3d" Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.749135 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.774081 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-jt2hn" podStartSLOduration=1.892951284 podStartE2EDuration="11.774058547s" podCreationTimestamp="2025-09-29 10:06:54 +0000 UTC" firstStartedPulling="2025-09-29 10:06:55.422818293 +0000 UTC m=+1145.627986614" lastFinishedPulling="2025-09-29 10:07:05.303925556 +0000 UTC m=+1155.509093877" observedRunningTime="2025-09-29 10:07:05.760044262 +0000 UTC m=+1155.965212583" watchObservedRunningTime="2025-09-29 10:07:05.774058547 +0000 UTC m=+1155.979226868" Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.775957 4891 scope.go:117] "RemoveContainer" containerID="cd7228a0a9e2f1de392c2bb43fd16881e4089be661cb1c959772224b90b20a04" Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.790828 4891 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.790865 4891 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.790881 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.790892 4891 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.790903 4891 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-td4b5\" (UniqueName: \"kubernetes.io/projected/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-kube-api-access-td4b5\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.791425 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" (UID: "e5a3896b-fa69-42c2-bf4f-fa7139fb0c31"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.799670 4891 scope.go:117] "RemoveContainer" containerID="5363bc3628edec075ea6023d3f89fdfb511a4993064bde2b10f9c3b24d9013f1" Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.809425 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-config-data" (OuterVolumeSpecName: "config-data") pod "e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" (UID: "e5a3896b-fa69-42c2-bf4f-fa7139fb0c31"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.820517 4891 scope.go:117] "RemoveContainer" containerID="ae592288fee06d9462ebe2ea7036637b6e893a3439821c26330cc1274241da3b" Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.892568 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:05 crc kubenswrapper[4891]: I0929 10:07:05.892887 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.085406 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.095995 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.114636 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:07:06 crc kubenswrapper[4891]: E0929 10:07:06.115099 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" containerName="sg-core" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.115122 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" containerName="sg-core" Sep 29 10:07:06 crc kubenswrapper[4891]: E0929 10:07:06.115137 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" containerName="ceilometer-notification-agent" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.115143 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" 
containerName="ceilometer-notification-agent" Sep 29 10:07:06 crc kubenswrapper[4891]: E0929 10:07:06.115153 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" containerName="proxy-httpd" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.115159 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" containerName="proxy-httpd" Sep 29 10:07:06 crc kubenswrapper[4891]: E0929 10:07:06.115190 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" containerName="ceilometer-central-agent" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.115198 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" containerName="ceilometer-central-agent" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.115371 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" containerName="proxy-httpd" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.115385 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" containerName="sg-core" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.115413 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" containerName="ceilometer-notification-agent" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.115428 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" containerName="ceilometer-central-agent" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.120804 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.125227 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.127019 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.137347 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.186513 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.186602 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.199295 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08fe93db-9a9c-4b65-9222-ecacc91840e4-config-data\") pod \"ceilometer-0\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " pod="openstack/ceilometer-0" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.199468 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08fe93db-9a9c-4b65-9222-ecacc91840e4-scripts\") pod \"ceilometer-0\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " 
pod="openstack/ceilometer-0" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.199599 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08fe93db-9a9c-4b65-9222-ecacc91840e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " pod="openstack/ceilometer-0" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.199626 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08fe93db-9a9c-4b65-9222-ecacc91840e4-log-httpd\") pod \"ceilometer-0\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " pod="openstack/ceilometer-0" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.199887 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82q4g\" (UniqueName: \"kubernetes.io/projected/08fe93db-9a9c-4b65-9222-ecacc91840e4-kube-api-access-82q4g\") pod \"ceilometer-0\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " pod="openstack/ceilometer-0" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.199928 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08fe93db-9a9c-4b65-9222-ecacc91840e4-run-httpd\") pod \"ceilometer-0\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " pod="openstack/ceilometer-0" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.200047 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08fe93db-9a9c-4b65-9222-ecacc91840e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " pod="openstack/ceilometer-0" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.302018 4891 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82q4g\" (UniqueName: \"kubernetes.io/projected/08fe93db-9a9c-4b65-9222-ecacc91840e4-kube-api-access-82q4g\") pod \"ceilometer-0\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " pod="openstack/ceilometer-0" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.302078 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08fe93db-9a9c-4b65-9222-ecacc91840e4-run-httpd\") pod \"ceilometer-0\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " pod="openstack/ceilometer-0" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.302135 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08fe93db-9a9c-4b65-9222-ecacc91840e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " pod="openstack/ceilometer-0" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.302200 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08fe93db-9a9c-4b65-9222-ecacc91840e4-config-data\") pod \"ceilometer-0\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " pod="openstack/ceilometer-0" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.302254 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08fe93db-9a9c-4b65-9222-ecacc91840e4-scripts\") pod \"ceilometer-0\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " pod="openstack/ceilometer-0" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.302306 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08fe93db-9a9c-4b65-9222-ecacc91840e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " pod="openstack/ceilometer-0" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.302328 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08fe93db-9a9c-4b65-9222-ecacc91840e4-log-httpd\") pod \"ceilometer-0\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " pod="openstack/ceilometer-0" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.302783 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08fe93db-9a9c-4b65-9222-ecacc91840e4-log-httpd\") pod \"ceilometer-0\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " pod="openstack/ceilometer-0" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.303243 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08fe93db-9a9c-4b65-9222-ecacc91840e4-run-httpd\") pod \"ceilometer-0\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " pod="openstack/ceilometer-0" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.307706 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08fe93db-9a9c-4b65-9222-ecacc91840e4-config-data\") pod \"ceilometer-0\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " pod="openstack/ceilometer-0" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.307911 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08fe93db-9a9c-4b65-9222-ecacc91840e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " pod="openstack/ceilometer-0" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.308296 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/08fe93db-9a9c-4b65-9222-ecacc91840e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " pod="openstack/ceilometer-0" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.308631 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08fe93db-9a9c-4b65-9222-ecacc91840e4-scripts\") pod \"ceilometer-0\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " pod="openstack/ceilometer-0" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.325330 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82q4g\" (UniqueName: \"kubernetes.io/projected/08fe93db-9a9c-4b65-9222-ecacc91840e4-kube-api-access-82q4g\") pod \"ceilometer-0\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " pod="openstack/ceilometer-0" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.413959 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5a3896b-fa69-42c2-bf4f-fa7139fb0c31" path="/var/lib/kubelet/pods/e5a3896b-fa69-42c2-bf4f-fa7139fb0c31/volumes" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.460638 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:07:06 crc kubenswrapper[4891]: I0929 10:07:06.946521 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:07:06 crc kubenswrapper[4891]: W0929 10:07:06.950735 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08fe93db_9a9c_4b65_9222_ecacc91840e4.slice/crio-706c072766c6cebe90f928c264cb0ee4e8eb0bd01adc0572bc92ac5dc70416ef WatchSource:0}: Error finding container 706c072766c6cebe90f928c264cb0ee4e8eb0bd01adc0572bc92ac5dc70416ef: Status 404 returned error can't find the container with id 706c072766c6cebe90f928c264cb0ee4e8eb0bd01adc0572bc92ac5dc70416ef Sep 29 10:07:07 crc kubenswrapper[4891]: I0929 10:07:07.770495 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08fe93db-9a9c-4b65-9222-ecacc91840e4","Type":"ContainerStarted","Data":"706c072766c6cebe90f928c264cb0ee4e8eb0bd01adc0572bc92ac5dc70416ef"} Sep 29 10:07:08 crc kubenswrapper[4891]: I0929 10:07:08.784261 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08fe93db-9a9c-4b65-9222-ecacc91840e4","Type":"ContainerStarted","Data":"cb01fdc79d18ff3b2bf4a06e8acbd046ef69ccbbd6951c26fd61d5ee795d04e9"} Sep 29 10:07:09 crc kubenswrapper[4891]: I0929 10:07:09.795232 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08fe93db-9a9c-4b65-9222-ecacc91840e4","Type":"ContainerStarted","Data":"81618d8d75a378a90b2c13f4863a758f3b3b619e7bf4d8c760675b36624013ad"} Sep 29 10:07:10 crc kubenswrapper[4891]: I0929 10:07:10.804638 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08fe93db-9a9c-4b65-9222-ecacc91840e4","Type":"ContainerStarted","Data":"9bf37b50943e5b45a329cbdbceece2052cfbb28ba315b8c10193ec4564bc72aa"} Sep 29 10:07:12 crc kubenswrapper[4891]: I0929 
10:07:12.825915 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08fe93db-9a9c-4b65-9222-ecacc91840e4","Type":"ContainerStarted","Data":"9e6472b37a3578debf9ee32dfba549a176bf5f837c8774827abaa3a19bc3a51b"} Sep 29 10:07:12 crc kubenswrapper[4891]: I0929 10:07:12.826428 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 29 10:07:15 crc kubenswrapper[4891]: I0929 10:07:15.255646 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.408124365 podStartE2EDuration="9.255622805s" podCreationTimestamp="2025-09-29 10:07:06 +0000 UTC" firstStartedPulling="2025-09-29 10:07:06.953972544 +0000 UTC m=+1157.159140865" lastFinishedPulling="2025-09-29 10:07:11.801470984 +0000 UTC m=+1162.006639305" observedRunningTime="2025-09-29 10:07:12.853277504 +0000 UTC m=+1163.058445845" watchObservedRunningTime="2025-09-29 10:07:15.255622805 +0000 UTC m=+1165.460791136" Sep 29 10:07:15 crc kubenswrapper[4891]: I0929 10:07:15.257400 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:07:15 crc kubenswrapper[4891]: I0929 10:07:15.257693 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="08fe93db-9a9c-4b65-9222-ecacc91840e4" containerName="ceilometer-central-agent" containerID="cri-o://cb01fdc79d18ff3b2bf4a06e8acbd046ef69ccbbd6951c26fd61d5ee795d04e9" gracePeriod=30 Sep 29 10:07:15 crc kubenswrapper[4891]: I0929 10:07:15.257725 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="08fe93db-9a9c-4b65-9222-ecacc91840e4" containerName="sg-core" containerID="cri-o://9bf37b50943e5b45a329cbdbceece2052cfbb28ba315b8c10193ec4564bc72aa" gracePeriod=30 Sep 29 10:07:15 crc kubenswrapper[4891]: I0929 10:07:15.257756 4891 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="08fe93db-9a9c-4b65-9222-ecacc91840e4" containerName="ceilometer-notification-agent" containerID="cri-o://81618d8d75a378a90b2c13f4863a758f3b3b619e7bf4d8c760675b36624013ad" gracePeriod=30 Sep 29 10:07:15 crc kubenswrapper[4891]: I0929 10:07:15.257783 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="08fe93db-9a9c-4b65-9222-ecacc91840e4" containerName="proxy-httpd" containerID="cri-o://9e6472b37a3578debf9ee32dfba549a176bf5f837c8774827abaa3a19bc3a51b" gracePeriod=30 Sep 29 10:07:15 crc kubenswrapper[4891]: I0929 10:07:15.864367 4891 generic.go:334] "Generic (PLEG): container finished" podID="08fe93db-9a9c-4b65-9222-ecacc91840e4" containerID="9e6472b37a3578debf9ee32dfba549a176bf5f837c8774827abaa3a19bc3a51b" exitCode=0 Sep 29 10:07:15 crc kubenswrapper[4891]: I0929 10:07:15.864401 4891 generic.go:334] "Generic (PLEG): container finished" podID="08fe93db-9a9c-4b65-9222-ecacc91840e4" containerID="9bf37b50943e5b45a329cbdbceece2052cfbb28ba315b8c10193ec4564bc72aa" exitCode=2 Sep 29 10:07:15 crc kubenswrapper[4891]: I0929 10:07:15.864409 4891 generic.go:334] "Generic (PLEG): container finished" podID="08fe93db-9a9c-4b65-9222-ecacc91840e4" containerID="81618d8d75a378a90b2c13f4863a758f3b3b619e7bf4d8c760675b36624013ad" exitCode=0 Sep 29 10:07:15 crc kubenswrapper[4891]: I0929 10:07:15.864417 4891 generic.go:334] "Generic (PLEG): container finished" podID="08fe93db-9a9c-4b65-9222-ecacc91840e4" containerID="cb01fdc79d18ff3b2bf4a06e8acbd046ef69ccbbd6951c26fd61d5ee795d04e9" exitCode=0 Sep 29 10:07:15 crc kubenswrapper[4891]: I0929 10:07:15.864439 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08fe93db-9a9c-4b65-9222-ecacc91840e4","Type":"ContainerDied","Data":"9e6472b37a3578debf9ee32dfba549a176bf5f837c8774827abaa3a19bc3a51b"} Sep 29 10:07:15 crc kubenswrapper[4891]: I0929 10:07:15.864465 4891 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"08fe93db-9a9c-4b65-9222-ecacc91840e4","Type":"ContainerDied","Data":"9bf37b50943e5b45a329cbdbceece2052cfbb28ba315b8c10193ec4564bc72aa"} Sep 29 10:07:15 crc kubenswrapper[4891]: I0929 10:07:15.864474 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08fe93db-9a9c-4b65-9222-ecacc91840e4","Type":"ContainerDied","Data":"81618d8d75a378a90b2c13f4863a758f3b3b619e7bf4d8c760675b36624013ad"} Sep 29 10:07:15 crc kubenswrapper[4891]: I0929 10:07:15.864483 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08fe93db-9a9c-4b65-9222-ecacc91840e4","Type":"ContainerDied","Data":"cb01fdc79d18ff3b2bf4a06e8acbd046ef69ccbbd6951c26fd61d5ee795d04e9"} Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.088090 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.200022 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08fe93db-9a9c-4b65-9222-ecacc91840e4-run-httpd\") pod \"08fe93db-9a9c-4b65-9222-ecacc91840e4\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.200120 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82q4g\" (UniqueName: \"kubernetes.io/projected/08fe93db-9a9c-4b65-9222-ecacc91840e4-kube-api-access-82q4g\") pod \"08fe93db-9a9c-4b65-9222-ecacc91840e4\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.200195 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08fe93db-9a9c-4b65-9222-ecacc91840e4-combined-ca-bundle\") pod \"08fe93db-9a9c-4b65-9222-ecacc91840e4\" (UID: 
\"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.200234 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08fe93db-9a9c-4b65-9222-ecacc91840e4-scripts\") pod \"08fe93db-9a9c-4b65-9222-ecacc91840e4\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.200442 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08fe93db-9a9c-4b65-9222-ecacc91840e4-sg-core-conf-yaml\") pod \"08fe93db-9a9c-4b65-9222-ecacc91840e4\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.200511 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08fe93db-9a9c-4b65-9222-ecacc91840e4-log-httpd\") pod \"08fe93db-9a9c-4b65-9222-ecacc91840e4\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.200575 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08fe93db-9a9c-4b65-9222-ecacc91840e4-config-data\") pod \"08fe93db-9a9c-4b65-9222-ecacc91840e4\" (UID: \"08fe93db-9a9c-4b65-9222-ecacc91840e4\") " Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.201323 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08fe93db-9a9c-4b65-9222-ecacc91840e4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "08fe93db-9a9c-4b65-9222-ecacc91840e4" (UID: "08fe93db-9a9c-4b65-9222-ecacc91840e4"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.201510 4891 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08fe93db-9a9c-4b65-9222-ecacc91840e4-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.201647 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08fe93db-9a9c-4b65-9222-ecacc91840e4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "08fe93db-9a9c-4b65-9222-ecacc91840e4" (UID: "08fe93db-9a9c-4b65-9222-ecacc91840e4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.207964 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08fe93db-9a9c-4b65-9222-ecacc91840e4-kube-api-access-82q4g" (OuterVolumeSpecName: "kube-api-access-82q4g") pod "08fe93db-9a9c-4b65-9222-ecacc91840e4" (UID: "08fe93db-9a9c-4b65-9222-ecacc91840e4"). InnerVolumeSpecName "kube-api-access-82q4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.209009 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08fe93db-9a9c-4b65-9222-ecacc91840e4-scripts" (OuterVolumeSpecName: "scripts") pod "08fe93db-9a9c-4b65-9222-ecacc91840e4" (UID: "08fe93db-9a9c-4b65-9222-ecacc91840e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.235559 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08fe93db-9a9c-4b65-9222-ecacc91840e4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "08fe93db-9a9c-4b65-9222-ecacc91840e4" (UID: "08fe93db-9a9c-4b65-9222-ecacc91840e4"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.295351 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08fe93db-9a9c-4b65-9222-ecacc91840e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08fe93db-9a9c-4b65-9222-ecacc91840e4" (UID: "08fe93db-9a9c-4b65-9222-ecacc91840e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.304831 4891 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08fe93db-9a9c-4b65-9222-ecacc91840e4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.304903 4891 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08fe93db-9a9c-4b65-9222-ecacc91840e4-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.304926 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82q4g\" (UniqueName: \"kubernetes.io/projected/08fe93db-9a9c-4b65-9222-ecacc91840e4-kube-api-access-82q4g\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.304949 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08fe93db-9a9c-4b65-9222-ecacc91840e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.305604 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08fe93db-9a9c-4b65-9222-ecacc91840e4-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.308438 4891 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/08fe93db-9a9c-4b65-9222-ecacc91840e4-config-data" (OuterVolumeSpecName: "config-data") pod "08fe93db-9a9c-4b65-9222-ecacc91840e4" (UID: "08fe93db-9a9c-4b65-9222-ecacc91840e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.407696 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08fe93db-9a9c-4b65-9222-ecacc91840e4-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.888422 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08fe93db-9a9c-4b65-9222-ecacc91840e4","Type":"ContainerDied","Data":"706c072766c6cebe90f928c264cb0ee4e8eb0bd01adc0572bc92ac5dc70416ef"} Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.888482 4891 scope.go:117] "RemoveContainer" containerID="9e6472b37a3578debf9ee32dfba549a176bf5f837c8774827abaa3a19bc3a51b" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.888584 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.913591 4891 scope.go:117] "RemoveContainer" containerID="9bf37b50943e5b45a329cbdbceece2052cfbb28ba315b8c10193ec4564bc72aa" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.925495 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.938450 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.939957 4891 scope.go:117] "RemoveContainer" containerID="81618d8d75a378a90b2c13f4863a758f3b3b619e7bf4d8c760675b36624013ad" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.947307 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:07:16 crc kubenswrapper[4891]: E0929 10:07:16.948925 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08fe93db-9a9c-4b65-9222-ecacc91840e4" containerName="ceilometer-notification-agent" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.948954 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="08fe93db-9a9c-4b65-9222-ecacc91840e4" containerName="ceilometer-notification-agent" Sep 29 10:07:16 crc kubenswrapper[4891]: E0929 10:07:16.948990 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08fe93db-9a9c-4b65-9222-ecacc91840e4" containerName="proxy-httpd" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.948998 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="08fe93db-9a9c-4b65-9222-ecacc91840e4" containerName="proxy-httpd" Sep 29 10:07:16 crc kubenswrapper[4891]: E0929 10:07:16.949006 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08fe93db-9a9c-4b65-9222-ecacc91840e4" containerName="ceilometer-central-agent" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.949014 4891 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="08fe93db-9a9c-4b65-9222-ecacc91840e4" containerName="ceilometer-central-agent" Sep 29 10:07:16 crc kubenswrapper[4891]: E0929 10:07:16.949045 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08fe93db-9a9c-4b65-9222-ecacc91840e4" containerName="sg-core" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.949052 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="08fe93db-9a9c-4b65-9222-ecacc91840e4" containerName="sg-core" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.949361 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="08fe93db-9a9c-4b65-9222-ecacc91840e4" containerName="proxy-httpd" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.949401 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="08fe93db-9a9c-4b65-9222-ecacc91840e4" containerName="ceilometer-central-agent" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.949416 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="08fe93db-9a9c-4b65-9222-ecacc91840e4" containerName="ceilometer-notification-agent" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.949423 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="08fe93db-9a9c-4b65-9222-ecacc91840e4" containerName="sg-core" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.951780 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.959137 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.992148 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 29 10:07:16 crc kubenswrapper[4891]: I0929 10:07:16.992414 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.010029 4891 scope.go:117] "RemoveContainer" containerID="cb01fdc79d18ff3b2bf4a06e8acbd046ef69ccbbd6951c26fd61d5ee795d04e9" Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.020505 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-config-data\") pod \"ceilometer-0\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " pod="openstack/ceilometer-0" Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.020734 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " pod="openstack/ceilometer-0" Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.020915 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-run-httpd\") pod \"ceilometer-0\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " pod="openstack/ceilometer-0" Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.020983 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " pod="openstack/ceilometer-0" Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.021005 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-log-httpd\") pod \"ceilometer-0\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " pod="openstack/ceilometer-0" Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.021042 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-scripts\") pod \"ceilometer-0\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " pod="openstack/ceilometer-0" Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.021072 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4c4d\" (UniqueName: \"kubernetes.io/projected/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-kube-api-access-k4c4d\") pod \"ceilometer-0\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " pod="openstack/ceilometer-0" Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.122650 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-run-httpd\") pod \"ceilometer-0\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " pod="openstack/ceilometer-0" Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.122720 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " pod="openstack/ceilometer-0" Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.122743 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-log-httpd\") pod \"ceilometer-0\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " pod="openstack/ceilometer-0" Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.122802 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-scripts\") pod \"ceilometer-0\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " pod="openstack/ceilometer-0" Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.122837 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4c4d\" (UniqueName: \"kubernetes.io/projected/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-kube-api-access-k4c4d\") pod \"ceilometer-0\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " pod="openstack/ceilometer-0" Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.122871 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-config-data\") pod \"ceilometer-0\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " pod="openstack/ceilometer-0" Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.122923 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " pod="openstack/ceilometer-0" Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.123256 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-run-httpd\") pod \"ceilometer-0\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " pod="openstack/ceilometer-0" Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.123605 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-log-httpd\") pod \"ceilometer-0\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " pod="openstack/ceilometer-0" Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.127221 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " pod="openstack/ceilometer-0" Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.127227 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-scripts\") pod \"ceilometer-0\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " pod="openstack/ceilometer-0" Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.128823 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " pod="openstack/ceilometer-0" Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.136697 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-config-data\") pod \"ceilometer-0\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " pod="openstack/ceilometer-0" Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.144391 4891 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4c4d\" (UniqueName: \"kubernetes.io/projected/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-kube-api-access-k4c4d\") pod \"ceilometer-0\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " pod="openstack/ceilometer-0" Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.303519 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.570197 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:07:17 crc kubenswrapper[4891]: I0929 10:07:17.898978 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c6db8d4-63be-4cd4-9046-3291d40d5bf9","Type":"ContainerStarted","Data":"d413473fd4b7638978f1cfb440e886aa45667edcec269c5c7f388be23b87d092"} Sep 29 10:07:18 crc kubenswrapper[4891]: I0929 10:07:18.408561 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08fe93db-9a9c-4b65-9222-ecacc91840e4" path="/var/lib/kubelet/pods/08fe93db-9a9c-4b65-9222-ecacc91840e4/volumes" Sep 29 10:07:18 crc kubenswrapper[4891]: I0929 10:07:18.912811 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c6db8d4-63be-4cd4-9046-3291d40d5bf9","Type":"ContainerStarted","Data":"339acbfb5c30a915d3004546d99cfc683ed475fc630608e8ba11344b6a5885ef"} Sep 29 10:07:18 crc kubenswrapper[4891]: I0929 10:07:18.915066 4891 generic.go:334] "Generic (PLEG): container finished" podID="2f8acd86-ee16-42c5-9309-7651699a0886" containerID="2d7ac500e158e103dd51ba49af2b87c5f48717820109d2f4561edaa95d5f6c12" exitCode=0 Sep 29 10:07:18 crc kubenswrapper[4891]: I0929 10:07:18.915103 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jt2hn" 
event={"ID":"2f8acd86-ee16-42c5-9309-7651699a0886","Type":"ContainerDied","Data":"2d7ac500e158e103dd51ba49af2b87c5f48717820109d2f4561edaa95d5f6c12"} Sep 29 10:07:19 crc kubenswrapper[4891]: I0929 10:07:19.927156 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c6db8d4-63be-4cd4-9046-3291d40d5bf9","Type":"ContainerStarted","Data":"b9eaa9a629e040af97ca11582be3d9cf24446d5ece244bfc30c02eacdf476759"} Sep 29 10:07:20 crc kubenswrapper[4891]: I0929 10:07:20.334940 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jt2hn" Sep 29 10:07:20 crc kubenswrapper[4891]: I0929 10:07:20.499931 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f8acd86-ee16-42c5-9309-7651699a0886-config-data\") pod \"2f8acd86-ee16-42c5-9309-7651699a0886\" (UID: \"2f8acd86-ee16-42c5-9309-7651699a0886\") " Sep 29 10:07:20 crc kubenswrapper[4891]: I0929 10:07:20.501185 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f8acd86-ee16-42c5-9309-7651699a0886-scripts\") pod \"2f8acd86-ee16-42c5-9309-7651699a0886\" (UID: \"2f8acd86-ee16-42c5-9309-7651699a0886\") " Sep 29 10:07:20 crc kubenswrapper[4891]: I0929 10:07:20.501588 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcr5n\" (UniqueName: \"kubernetes.io/projected/2f8acd86-ee16-42c5-9309-7651699a0886-kube-api-access-fcr5n\") pod \"2f8acd86-ee16-42c5-9309-7651699a0886\" (UID: \"2f8acd86-ee16-42c5-9309-7651699a0886\") " Sep 29 10:07:20 crc kubenswrapper[4891]: I0929 10:07:20.502001 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f8acd86-ee16-42c5-9309-7651699a0886-combined-ca-bundle\") pod 
\"2f8acd86-ee16-42c5-9309-7651699a0886\" (UID: \"2f8acd86-ee16-42c5-9309-7651699a0886\") " Sep 29 10:07:20 crc kubenswrapper[4891]: I0929 10:07:20.517127 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f8acd86-ee16-42c5-9309-7651699a0886-kube-api-access-fcr5n" (OuterVolumeSpecName: "kube-api-access-fcr5n") pod "2f8acd86-ee16-42c5-9309-7651699a0886" (UID: "2f8acd86-ee16-42c5-9309-7651699a0886"). InnerVolumeSpecName "kube-api-access-fcr5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:07:20 crc kubenswrapper[4891]: I0929 10:07:20.519468 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f8acd86-ee16-42c5-9309-7651699a0886-scripts" (OuterVolumeSpecName: "scripts") pod "2f8acd86-ee16-42c5-9309-7651699a0886" (UID: "2f8acd86-ee16-42c5-9309-7651699a0886"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:20 crc kubenswrapper[4891]: I0929 10:07:20.534915 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f8acd86-ee16-42c5-9309-7651699a0886-config-data" (OuterVolumeSpecName: "config-data") pod "2f8acd86-ee16-42c5-9309-7651699a0886" (UID: "2f8acd86-ee16-42c5-9309-7651699a0886"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:20 crc kubenswrapper[4891]: I0929 10:07:20.537480 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f8acd86-ee16-42c5-9309-7651699a0886-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f8acd86-ee16-42c5-9309-7651699a0886" (UID: "2f8acd86-ee16-42c5-9309-7651699a0886"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:20 crc kubenswrapper[4891]: I0929 10:07:20.607522 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcr5n\" (UniqueName: \"kubernetes.io/projected/2f8acd86-ee16-42c5-9309-7651699a0886-kube-api-access-fcr5n\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:20 crc kubenswrapper[4891]: I0929 10:07:20.607554 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f8acd86-ee16-42c5-9309-7651699a0886-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:20 crc kubenswrapper[4891]: I0929 10:07:20.607564 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f8acd86-ee16-42c5-9309-7651699a0886-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:20 crc kubenswrapper[4891]: I0929 10:07:20.607575 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f8acd86-ee16-42c5-9309-7651699a0886-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:20 crc kubenswrapper[4891]: I0929 10:07:20.938606 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jt2hn" event={"ID":"2f8acd86-ee16-42c5-9309-7651699a0886","Type":"ContainerDied","Data":"5bad63b465fb9b6045ae9603e7e816428de3aecdf941fd988b8507a1c582601a"} Sep 29 10:07:20 crc kubenswrapper[4891]: I0929 10:07:20.938664 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bad63b465fb9b6045ae9603e7e816428de3aecdf941fd988b8507a1c582601a" Sep 29 10:07:20 crc kubenswrapper[4891]: I0929 10:07:20.938739 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jt2hn" Sep 29 10:07:20 crc kubenswrapper[4891]: I0929 10:07:20.944969 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c6db8d4-63be-4cd4-9046-3291d40d5bf9","Type":"ContainerStarted","Data":"ca8c3746eb0706dbde359a9bf23b29422f3ba18bdcc369e349d97d6dc1eecf24"} Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.125345 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 29 10:07:21 crc kubenswrapper[4891]: E0929 10:07:21.126543 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f8acd86-ee16-42c5-9309-7651699a0886" containerName="nova-cell0-conductor-db-sync" Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.126565 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f8acd86-ee16-42c5-9309-7651699a0886" containerName="nova-cell0-conductor-db-sync" Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.126949 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f8acd86-ee16-42c5-9309-7651699a0886" containerName="nova-cell0-conductor-db-sync" Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.127833 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.159955 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-ps2k7" Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.160345 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.167916 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.323105 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01cf2585-dd79-4154-8567-2c24dee11709-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"01cf2585-dd79-4154-8567-2c24dee11709\") " pod="openstack/nova-cell0-conductor-0" Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.323284 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01cf2585-dd79-4154-8567-2c24dee11709-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"01cf2585-dd79-4154-8567-2c24dee11709\") " pod="openstack/nova-cell0-conductor-0" Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.323425 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ftjx\" (UniqueName: \"kubernetes.io/projected/01cf2585-dd79-4154-8567-2c24dee11709-kube-api-access-6ftjx\") pod \"nova-cell0-conductor-0\" (UID: \"01cf2585-dd79-4154-8567-2c24dee11709\") " pod="openstack/nova-cell0-conductor-0" Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.425183 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/01cf2585-dd79-4154-8567-2c24dee11709-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"01cf2585-dd79-4154-8567-2c24dee11709\") " pod="openstack/nova-cell0-conductor-0" Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.425359 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ftjx\" (UniqueName: \"kubernetes.io/projected/01cf2585-dd79-4154-8567-2c24dee11709-kube-api-access-6ftjx\") pod \"nova-cell0-conductor-0\" (UID: \"01cf2585-dd79-4154-8567-2c24dee11709\") " pod="openstack/nova-cell0-conductor-0" Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.425439 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01cf2585-dd79-4154-8567-2c24dee11709-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"01cf2585-dd79-4154-8567-2c24dee11709\") " pod="openstack/nova-cell0-conductor-0" Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.437743 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01cf2585-dd79-4154-8567-2c24dee11709-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"01cf2585-dd79-4154-8567-2c24dee11709\") " pod="openstack/nova-cell0-conductor-0" Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.437754 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01cf2585-dd79-4154-8567-2c24dee11709-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"01cf2585-dd79-4154-8567-2c24dee11709\") " pod="openstack/nova-cell0-conductor-0" Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.452090 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ftjx\" (UniqueName: \"kubernetes.io/projected/01cf2585-dd79-4154-8567-2c24dee11709-kube-api-access-6ftjx\") pod \"nova-cell0-conductor-0\" 
(UID: \"01cf2585-dd79-4154-8567-2c24dee11709\") " pod="openstack/nova-cell0-conductor-0" Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.479107 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.806817 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.821281 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.954694 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"01cf2585-dd79-4154-8567-2c24dee11709","Type":"ContainerStarted","Data":"c5ca2fcd0a0b484683f51edb530dd8fd950f8b44c0a48f609c43b595076de994"} Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.959597 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c6db8d4-63be-4cd4-9046-3291d40d5bf9","Type":"ContainerStarted","Data":"83d1f3977ce6fa274eb7b28120ff27e9f06ce8eefb41e86207e84443ece6f2b3"} Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.959775 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1c6db8d4-63be-4cd4-9046-3291d40d5bf9" containerName="ceilometer-central-agent" containerID="cri-o://339acbfb5c30a915d3004546d99cfc683ed475fc630608e8ba11344b6a5885ef" gracePeriod=30 Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.960093 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.960348 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1c6db8d4-63be-4cd4-9046-3291d40d5bf9" containerName="proxy-httpd" 
containerID="cri-o://83d1f3977ce6fa274eb7b28120ff27e9f06ce8eefb41e86207e84443ece6f2b3" gracePeriod=30
Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.960398 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1c6db8d4-63be-4cd4-9046-3291d40d5bf9" containerName="sg-core" containerID="cri-o://ca8c3746eb0706dbde359a9bf23b29422f3ba18bdcc369e349d97d6dc1eecf24" gracePeriod=30
Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.960431 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1c6db8d4-63be-4cd4-9046-3291d40d5bf9" containerName="ceilometer-notification-agent" containerID="cri-o://b9eaa9a629e040af97ca11582be3d9cf24446d5ece244bfc30c02eacdf476759" gracePeriod=30
Sep 29 10:07:21 crc kubenswrapper[4891]: I0929 10:07:21.991991 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.92390956 podStartE2EDuration="5.991967385s" podCreationTimestamp="2025-09-29 10:07:16 +0000 UTC" firstStartedPulling="2025-09-29 10:07:17.591756155 +0000 UTC m=+1167.796924486" lastFinishedPulling="2025-09-29 10:07:21.659814 +0000 UTC m=+1171.864982311" observedRunningTime="2025-09-29 10:07:21.985266954 +0000 UTC m=+1172.190435285" watchObservedRunningTime="2025-09-29 10:07:21.991967385 +0000 UTC m=+1172.197135706"
Sep 29 10:07:22 crc kubenswrapper[4891]: I0929 10:07:22.972347 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"01cf2585-dd79-4154-8567-2c24dee11709","Type":"ContainerStarted","Data":"0a2d48e24f4f9f2c459928da48ce5980d4663923dc59ff13dc6afdc045c69061"}
Sep 29 10:07:22 crc kubenswrapper[4891]: I0929 10:07:22.972478 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Sep 29 10:07:22 crc kubenswrapper[4891]: I0929 10:07:22.975964 4891 generic.go:334] "Generic (PLEG): container finished" podID="1c6db8d4-63be-4cd4-9046-3291d40d5bf9" containerID="ca8c3746eb0706dbde359a9bf23b29422f3ba18bdcc369e349d97d6dc1eecf24" exitCode=2
Sep 29 10:07:22 crc kubenswrapper[4891]: I0929 10:07:22.975991 4891 generic.go:334] "Generic (PLEG): container finished" podID="1c6db8d4-63be-4cd4-9046-3291d40d5bf9" containerID="b9eaa9a629e040af97ca11582be3d9cf24446d5ece244bfc30c02eacdf476759" exitCode=0
Sep 29 10:07:22 crc kubenswrapper[4891]: I0929 10:07:22.976001 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c6db8d4-63be-4cd4-9046-3291d40d5bf9","Type":"ContainerDied","Data":"ca8c3746eb0706dbde359a9bf23b29422f3ba18bdcc369e349d97d6dc1eecf24"}
Sep 29 10:07:22 crc kubenswrapper[4891]: I0929 10:07:22.976036 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c6db8d4-63be-4cd4-9046-3291d40d5bf9","Type":"ContainerDied","Data":"b9eaa9a629e040af97ca11582be3d9cf24446d5ece244bfc30c02eacdf476759"}
Sep 29 10:07:22 crc kubenswrapper[4891]: I0929 10:07:22.995530 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.9955135290000001 podStartE2EDuration="1.995513529s" podCreationTimestamp="2025-09-29 10:07:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:07:22.994193691 +0000 UTC m=+1173.199362012" watchObservedRunningTime="2025-09-29 10:07:22.995513529 +0000 UTC m=+1173.200681850"
Sep 29 10:07:26 crc kubenswrapper[4891]: I0929 10:07:26.512233 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.042650 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-d7m4m"]
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.044421 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-d7m4m"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.046912 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.050146 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.055295 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-d7m4m"]
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.148147 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b17acc18-5c95-4d7c-9576-ba976472f02d-config-data\") pod \"nova-cell0-cell-mapping-d7m4m\" (UID: \"b17acc18-5c95-4d7c-9576-ba976472f02d\") " pod="openstack/nova-cell0-cell-mapping-d7m4m"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.148280 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbqfw\" (UniqueName: \"kubernetes.io/projected/b17acc18-5c95-4d7c-9576-ba976472f02d-kube-api-access-pbqfw\") pod \"nova-cell0-cell-mapping-d7m4m\" (UID: \"b17acc18-5c95-4d7c-9576-ba976472f02d\") " pod="openstack/nova-cell0-cell-mapping-d7m4m"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.148346 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b17acc18-5c95-4d7c-9576-ba976472f02d-scripts\") pod \"nova-cell0-cell-mapping-d7m4m\" (UID: \"b17acc18-5c95-4d7c-9576-ba976472f02d\") " pod="openstack/nova-cell0-cell-mapping-d7m4m"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.148553 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b17acc18-5c95-4d7c-9576-ba976472f02d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-d7m4m\" (UID: \"b17acc18-5c95-4d7c-9576-ba976472f02d\") " pod="openstack/nova-cell0-cell-mapping-d7m4m"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.249480 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.250249 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b17acc18-5c95-4d7c-9576-ba976472f02d-scripts\") pod \"nova-cell0-cell-mapping-d7m4m\" (UID: \"b17acc18-5c95-4d7c-9576-ba976472f02d\") " pod="openstack/nova-cell0-cell-mapping-d7m4m"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.250437 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b17acc18-5c95-4d7c-9576-ba976472f02d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-d7m4m\" (UID: \"b17acc18-5c95-4d7c-9576-ba976472f02d\") " pod="openstack/nova-cell0-cell-mapping-d7m4m"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.250497 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b17acc18-5c95-4d7c-9576-ba976472f02d-config-data\") pod \"nova-cell0-cell-mapping-d7m4m\" (UID: \"b17acc18-5c95-4d7c-9576-ba976472f02d\") " pod="openstack/nova-cell0-cell-mapping-d7m4m"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.250621 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbqfw\" (UniqueName: \"kubernetes.io/projected/b17acc18-5c95-4d7c-9576-ba976472f02d-kube-api-access-pbqfw\") pod \"nova-cell0-cell-mapping-d7m4m\" (UID: \"b17acc18-5c95-4d7c-9576-ba976472f02d\") " pod="openstack/nova-cell0-cell-mapping-d7m4m"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.257435 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.268930 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b17acc18-5c95-4d7c-9576-ba976472f02d-config-data\") pod \"nova-cell0-cell-mapping-d7m4m\" (UID: \"b17acc18-5c95-4d7c-9576-ba976472f02d\") " pod="openstack/nova-cell0-cell-mapping-d7m4m"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.273989 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b17acc18-5c95-4d7c-9576-ba976472f02d-scripts\") pod \"nova-cell0-cell-mapping-d7m4m\" (UID: \"b17acc18-5c95-4d7c-9576-ba976472f02d\") " pod="openstack/nova-cell0-cell-mapping-d7m4m"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.295647 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.300618 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b17acc18-5c95-4d7c-9576-ba976472f02d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-d7m4m\" (UID: \"b17acc18-5c95-4d7c-9576-ba976472f02d\") " pod="openstack/nova-cell0-cell-mapping-d7m4m"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.300702 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.305671 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.310392 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.310647 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbqfw\" (UniqueName: \"kubernetes.io/projected/b17acc18-5c95-4d7c-9576-ba976472f02d-kube-api-access-pbqfw\") pod \"nova-cell0-cell-mapping-d7m4m\" (UID: \"b17acc18-5c95-4d7c-9576-ba976472f02d\") " pod="openstack/nova-cell0-cell-mapping-d7m4m"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.313468 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.340861 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.386983 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.388500 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.393087 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.433859 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.450930 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-hb5lh"]
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.457598 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n742m\" (UniqueName: \"kubernetes.io/projected/62b86eba-1d95-4b53-b84c-902ff2665b50-kube-api-access-n742m\") pod \"nova-api-0\" (UID: \"62b86eba-1d95-4b53-b84c-902ff2665b50\") " pod="openstack/nova-api-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.457659 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cllxb\" (UniqueName: \"kubernetes.io/projected/86fdb604-1923-4ae3-8773-92b58196f1c7-kube-api-access-cllxb\") pod \"nova-metadata-0\" (UID: \"86fdb604-1923-4ae3-8773-92b58196f1c7\") " pod="openstack/nova-metadata-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.457732 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86fdb604-1923-4ae3-8773-92b58196f1c7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"86fdb604-1923-4ae3-8773-92b58196f1c7\") " pod="openstack/nova-metadata-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.457759 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62b86eba-1d95-4b53-b84c-902ff2665b50-config-data\") pod \"nova-api-0\" (UID: \"62b86eba-1d95-4b53-b84c-902ff2665b50\") " pod="openstack/nova-api-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.457803 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86fdb604-1923-4ae3-8773-92b58196f1c7-config-data\") pod \"nova-metadata-0\" (UID: \"86fdb604-1923-4ae3-8773-92b58196f1c7\") " pod="openstack/nova-metadata-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.457825 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62b86eba-1d95-4b53-b84c-902ff2665b50-logs\") pod \"nova-api-0\" (UID: \"62b86eba-1d95-4b53-b84c-902ff2665b50\") " pod="openstack/nova-api-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.457853 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b86eba-1d95-4b53-b84c-902ff2665b50-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"62b86eba-1d95-4b53-b84c-902ff2665b50\") " pod="openstack/nova-api-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.458005 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86fdb604-1923-4ae3-8773-92b58196f1c7-logs\") pod \"nova-metadata-0\" (UID: \"86fdb604-1923-4ae3-8773-92b58196f1c7\") " pod="openstack/nova-metadata-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.459018 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-hb5lh"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.461816 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-d7m4m"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.472327 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-hb5lh"]
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.559457 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-dns-svc\") pod \"dnsmasq-dns-865f5d856f-hb5lh\" (UID: \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\") " pod="openstack/dnsmasq-dns-865f5d856f-hb5lh"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.559554 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86fdb604-1923-4ae3-8773-92b58196f1c7-logs\") pod \"nova-metadata-0\" (UID: \"86fdb604-1923-4ae3-8773-92b58196f1c7\") " pod="openstack/nova-metadata-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.559591 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxjpg\" (UniqueName: \"kubernetes.io/projected/3cf2b7a9-9892-47f9-b290-775806be28cc-kube-api-access-cxjpg\") pod \"nova-scheduler-0\" (UID: \"3cf2b7a9-9892-47f9-b290-775806be28cc\") " pod="openstack/nova-scheduler-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.559616 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-hb5lh\" (UID: \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\") " pod="openstack/dnsmasq-dns-865f5d856f-hb5lh"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.559640 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-config\") pod \"dnsmasq-dns-865f5d856f-hb5lh\" (UID: \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\") " pod="openstack/dnsmasq-dns-865f5d856f-hb5lh"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.559658 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf2b7a9-9892-47f9-b290-775806be28cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3cf2b7a9-9892-47f9-b290-775806be28cc\") " pod="openstack/nova-scheduler-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.559677 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-hb5lh\" (UID: \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\") " pod="openstack/dnsmasq-dns-865f5d856f-hb5lh"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.559705 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-hb5lh\" (UID: \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\") " pod="openstack/dnsmasq-dns-865f5d856f-hb5lh"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.559729 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n742m\" (UniqueName: \"kubernetes.io/projected/62b86eba-1d95-4b53-b84c-902ff2665b50-kube-api-access-n742m\") pod \"nova-api-0\" (UID: \"62b86eba-1d95-4b53-b84c-902ff2665b50\") " pod="openstack/nova-api-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.559747 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cllxb\" (UniqueName: \"kubernetes.io/projected/86fdb604-1923-4ae3-8773-92b58196f1c7-kube-api-access-cllxb\") pod \"nova-metadata-0\" (UID: \"86fdb604-1923-4ae3-8773-92b58196f1c7\") " pod="openstack/nova-metadata-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.559778 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m95xl\" (UniqueName: \"kubernetes.io/projected/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-kube-api-access-m95xl\") pod \"dnsmasq-dns-865f5d856f-hb5lh\" (UID: \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\") " pod="openstack/dnsmasq-dns-865f5d856f-hb5lh"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.559852 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf2b7a9-9892-47f9-b290-775806be28cc-config-data\") pod \"nova-scheduler-0\" (UID: \"3cf2b7a9-9892-47f9-b290-775806be28cc\") " pod="openstack/nova-scheduler-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.559867 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86fdb604-1923-4ae3-8773-92b58196f1c7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"86fdb604-1923-4ae3-8773-92b58196f1c7\") " pod="openstack/nova-metadata-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.559886 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62b86eba-1d95-4b53-b84c-902ff2665b50-config-data\") pod \"nova-api-0\" (UID: \"62b86eba-1d95-4b53-b84c-902ff2665b50\") " pod="openstack/nova-api-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.559904 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86fdb604-1923-4ae3-8773-92b58196f1c7-config-data\") pod \"nova-metadata-0\" (UID: \"86fdb604-1923-4ae3-8773-92b58196f1c7\") " pod="openstack/nova-metadata-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.559919 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62b86eba-1d95-4b53-b84c-902ff2665b50-logs\") pod \"nova-api-0\" (UID: \"62b86eba-1d95-4b53-b84c-902ff2665b50\") " pod="openstack/nova-api-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.559935 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b86eba-1d95-4b53-b84c-902ff2665b50-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"62b86eba-1d95-4b53-b84c-902ff2665b50\") " pod="openstack/nova-api-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.561154 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86fdb604-1923-4ae3-8773-92b58196f1c7-logs\") pod \"nova-metadata-0\" (UID: \"86fdb604-1923-4ae3-8773-92b58196f1c7\") " pod="openstack/nova-metadata-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.565194 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62b86eba-1d95-4b53-b84c-902ff2665b50-logs\") pod \"nova-api-0\" (UID: \"62b86eba-1d95-4b53-b84c-902ff2665b50\") " pod="openstack/nova-api-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.566331 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86fdb604-1923-4ae3-8773-92b58196f1c7-config-data\") pod \"nova-metadata-0\" (UID: \"86fdb604-1923-4ae3-8773-92b58196f1c7\") " pod="openstack/nova-metadata-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.575633 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b86eba-1d95-4b53-b84c-902ff2665b50-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"62b86eba-1d95-4b53-b84c-902ff2665b50\") " pod="openstack/nova-api-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.578237 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86fdb604-1923-4ae3-8773-92b58196f1c7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"86fdb604-1923-4ae3-8773-92b58196f1c7\") " pod="openstack/nova-metadata-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.581692 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62b86eba-1d95-4b53-b84c-902ff2665b50-config-data\") pod \"nova-api-0\" (UID: \"62b86eba-1d95-4b53-b84c-902ff2665b50\") " pod="openstack/nova-api-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.581764 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.583043 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.597207 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.597769 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.602331 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cllxb\" (UniqueName: \"kubernetes.io/projected/86fdb604-1923-4ae3-8773-92b58196f1c7-kube-api-access-cllxb\") pod \"nova-metadata-0\" (UID: \"86fdb604-1923-4ae3-8773-92b58196f1c7\") " pod="openstack/nova-metadata-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.611613 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n742m\" (UniqueName: \"kubernetes.io/projected/62b86eba-1d95-4b53-b84c-902ff2665b50-kube-api-access-n742m\") pod \"nova-api-0\" (UID: \"62b86eba-1d95-4b53-b84c-902ff2665b50\") " pod="openstack/nova-api-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.669165 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxjpg\" (UniqueName: \"kubernetes.io/projected/3cf2b7a9-9892-47f9-b290-775806be28cc-kube-api-access-cxjpg\") pod \"nova-scheduler-0\" (UID: \"3cf2b7a9-9892-47f9-b290-775806be28cc\") " pod="openstack/nova-scheduler-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.669234 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-hb5lh\" (UID: \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\") " pod="openstack/dnsmasq-dns-865f5d856f-hb5lh"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.669266 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-config\") pod \"dnsmasq-dns-865f5d856f-hb5lh\" (UID: \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\") " pod="openstack/dnsmasq-dns-865f5d856f-hb5lh"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.669287 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf2b7a9-9892-47f9-b290-775806be28cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3cf2b7a9-9892-47f9-b290-775806be28cc\") " pod="openstack/nova-scheduler-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.669307 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-hb5lh\" (UID: \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\") " pod="openstack/dnsmasq-dns-865f5d856f-hb5lh"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.669343 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-hb5lh\" (UID: \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\") " pod="openstack/dnsmasq-dns-865f5d856f-hb5lh"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.669398 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m95xl\" (UniqueName: \"kubernetes.io/projected/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-kube-api-access-m95xl\") pod \"dnsmasq-dns-865f5d856f-hb5lh\" (UID: \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\") " pod="openstack/dnsmasq-dns-865f5d856f-hb5lh"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.669432 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf2b7a9-9892-47f9-b290-775806be28cc-config-data\") pod \"nova-scheduler-0\" (UID: \"3cf2b7a9-9892-47f9-b290-775806be28cc\") " pod="openstack/nova-scheduler-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.669488 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-dns-svc\") pod \"dnsmasq-dns-865f5d856f-hb5lh\" (UID: \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\") " pod="openstack/dnsmasq-dns-865f5d856f-hb5lh"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.670416 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-config\") pod \"dnsmasq-dns-865f5d856f-hb5lh\" (UID: \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\") " pod="openstack/dnsmasq-dns-865f5d856f-hb5lh"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.670439 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-hb5lh\" (UID: \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\") " pod="openstack/dnsmasq-dns-865f5d856f-hb5lh"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.674070 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-dns-svc\") pod \"dnsmasq-dns-865f5d856f-hb5lh\" (UID: \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\") " pod="openstack/dnsmasq-dns-865f5d856f-hb5lh"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.674406 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-hb5lh\" (UID: \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\") " pod="openstack/dnsmasq-dns-865f5d856f-hb5lh"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.674475 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-hb5lh\" (UID: \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\") " pod="openstack/dnsmasq-dns-865f5d856f-hb5lh"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.688558 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf2b7a9-9892-47f9-b290-775806be28cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3cf2b7a9-9892-47f9-b290-775806be28cc\") " pod="openstack/nova-scheduler-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.691094 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxjpg\" (UniqueName: \"kubernetes.io/projected/3cf2b7a9-9892-47f9-b290-775806be28cc-kube-api-access-cxjpg\") pod \"nova-scheduler-0\" (UID: \"3cf2b7a9-9892-47f9-b290-775806be28cc\") " pod="openstack/nova-scheduler-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.695542 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m95xl\" (UniqueName: \"kubernetes.io/projected/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-kube-api-access-m95xl\") pod \"dnsmasq-dns-865f5d856f-hb5lh\" (UID: \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\") " pod="openstack/dnsmasq-dns-865f5d856f-hb5lh"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.697047 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf2b7a9-9892-47f9-b290-775806be28cc-config-data\") pod \"nova-scheduler-0\" (UID: \"3cf2b7a9-9892-47f9-b290-775806be28cc\") " pod="openstack/nova-scheduler-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.700172 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.725198 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.736002 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.776528 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a94a9f15-db51-47f4-9456-5c64e83e413f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a94a9f15-db51-47f4-9456-5c64e83e413f\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.776804 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a94a9f15-db51-47f4-9456-5c64e83e413f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a94a9f15-db51-47f4-9456-5c64e83e413f\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.776874 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwq6k\" (UniqueName: \"kubernetes.io/projected/a94a9f15-db51-47f4-9456-5c64e83e413f-kube-api-access-bwq6k\") pod \"nova-cell1-novncproxy-0\" (UID: \"a94a9f15-db51-47f4-9456-5c64e83e413f\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.794404 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-hb5lh"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.878779 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a94a9f15-db51-47f4-9456-5c64e83e413f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a94a9f15-db51-47f4-9456-5c64e83e413f\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.878900 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a94a9f15-db51-47f4-9456-5c64e83e413f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a94a9f15-db51-47f4-9456-5c64e83e413f\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.879002 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwq6k\" (UniqueName: \"kubernetes.io/projected/a94a9f15-db51-47f4-9456-5c64e83e413f-kube-api-access-bwq6k\") pod \"nova-cell1-novncproxy-0\" (UID: \"a94a9f15-db51-47f4-9456-5c64e83e413f\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.890530 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a94a9f15-db51-47f4-9456-5c64e83e413f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a94a9f15-db51-47f4-9456-5c64e83e413f\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.894350 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a94a9f15-db51-47f4-9456-5c64e83e413f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a94a9f15-db51-47f4-9456-5c64e83e413f\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.904725 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwq6k\" (UniqueName: \"kubernetes.io/projected/a94a9f15-db51-47f4-9456-5c64e83e413f-kube-api-access-bwq6k\") pod \"nova-cell1-novncproxy-0\" (UID: \"a94a9f15-db51-47f4-9456-5c64e83e413f\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 29 10:07:27 crc kubenswrapper[4891]: I0929 10:07:27.946088 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.043315 4891 generic.go:334] "Generic (PLEG): container finished" podID="1c6db8d4-63be-4cd4-9046-3291d40d5bf9" containerID="339acbfb5c30a915d3004546d99cfc683ed475fc630608e8ba11344b6a5885ef" exitCode=0
Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.043385 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c6db8d4-63be-4cd4-9046-3291d40d5bf9","Type":"ContainerDied","Data":"339acbfb5c30a915d3004546d99cfc683ed475fc630608e8ba11344b6a5885ef"}
Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.049088 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-d7m4m"]
Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.240298 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kq9cj"]
Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.241743 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kq9cj"
Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.246761 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.247074 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.268846 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kq9cj"]
Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.286828 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.398637 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c8148fa-ad96-4f4b-8910-7233808ce733-scripts\") pod \"nova-cell1-conductor-db-sync-kq9cj\" (UID: \"8c8148fa-ad96-4f4b-8910-7233808ce733\") " pod="openstack/nova-cell1-conductor-db-sync-kq9cj"
Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.399197 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8148fa-ad96-4f4b-8910-7233808ce733-config-data\") pod \"nova-cell1-conductor-db-sync-kq9cj\" (UID: \"8c8148fa-ad96-4f4b-8910-7233808ce733\") " pod="openstack/nova-cell1-conductor-db-sync-kq9cj"
Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.399247 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gdqg\" (UniqueName: \"kubernetes.io/projected/8c8148fa-ad96-4f4b-8910-7233808ce733-kube-api-access-2gdqg\") pod \"nova-cell1-conductor-db-sync-kq9cj\" (UID: \"8c8148fa-ad96-4f4b-8910-7233808ce733\") " pod="openstack/nova-cell1-conductor-db-sync-kq9cj"
Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.399287 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8148fa-ad96-4f4b-8910-7233808ce733-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-kq9cj\" (UID: \"8c8148fa-ad96-4f4b-8910-7233808ce733\") " pod="openstack/nova-cell1-conductor-db-sync-kq9cj"
Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.443442 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.502936 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8148fa-ad96-4f4b-8910-7233808ce733-config-data\") pod \"nova-cell1-conductor-db-sync-kq9cj\" (UID: \"8c8148fa-ad96-4f4b-8910-7233808ce733\") " pod="openstack/nova-cell1-conductor-db-sync-kq9cj"
Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.503026 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gdqg\" (UniqueName: \"kubernetes.io/projected/8c8148fa-ad96-4f4b-8910-7233808ce733-kube-api-access-2gdqg\") pod \"nova-cell1-conductor-db-sync-kq9cj\" (UID: \"8c8148fa-ad96-4f4b-8910-7233808ce733\") " pod="openstack/nova-cell1-conductor-db-sync-kq9cj"
Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.503085 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8148fa-ad96-4f4b-8910-7233808ce733-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-kq9cj\" (UID: \"8c8148fa-ad96-4f4b-8910-7233808ce733\") " pod="openstack/nova-cell1-conductor-db-sync-kq9cj"
Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.503156 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName:
\"kubernetes.io/secret/8c8148fa-ad96-4f4b-8910-7233808ce733-scripts\") pod \"nova-cell1-conductor-db-sync-kq9cj\" (UID: \"8c8148fa-ad96-4f4b-8910-7233808ce733\") " pod="openstack/nova-cell1-conductor-db-sync-kq9cj" Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.510713 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8148fa-ad96-4f4b-8910-7233808ce733-config-data\") pod \"nova-cell1-conductor-db-sync-kq9cj\" (UID: \"8c8148fa-ad96-4f4b-8910-7233808ce733\") " pod="openstack/nova-cell1-conductor-db-sync-kq9cj" Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.511405 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8148fa-ad96-4f4b-8910-7233808ce733-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-kq9cj\" (UID: \"8c8148fa-ad96-4f4b-8910-7233808ce733\") " pod="openstack/nova-cell1-conductor-db-sync-kq9cj" Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.515338 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c8148fa-ad96-4f4b-8910-7233808ce733-scripts\") pod \"nova-cell1-conductor-db-sync-kq9cj\" (UID: \"8c8148fa-ad96-4f4b-8910-7233808ce733\") " pod="openstack/nova-cell1-conductor-db-sync-kq9cj" Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.521622 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gdqg\" (UniqueName: \"kubernetes.io/projected/8c8148fa-ad96-4f4b-8910-7233808ce733-kube-api-access-2gdqg\") pod \"nova-cell1-conductor-db-sync-kq9cj\" (UID: \"8c8148fa-ad96-4f4b-8910-7233808ce733\") " pod="openstack/nova-cell1-conductor-db-sync-kq9cj" Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.524776 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:07:28 crc kubenswrapper[4891]: W0929 10:07:28.535122 4891 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cf2b7a9_9892_47f9_b290_775806be28cc.slice/crio-bad59f544f0d7a8cf8c1d42abd134a70827b3835f202344d0cdba539d30fb7d9 WatchSource:0}: Error finding container bad59f544f0d7a8cf8c1d42abd134a70827b3835f202344d0cdba539d30fb7d9: Status 404 returned error can't find the container with id bad59f544f0d7a8cf8c1d42abd134a70827b3835f202344d0cdba539d30fb7d9 Sep 29 10:07:28 crc kubenswrapper[4891]: W0929 10:07:28.653742 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbdfc2d8_bf46_4c65_91d8_a6dbe38d99d4.slice/crio-fb2e92c62b0ed0fa59021e17c9f1e50c03ea5db71fe71fbf00212cd6db7e45a9 WatchSource:0}: Error finding container fb2e92c62b0ed0fa59021e17c9f1e50c03ea5db71fe71fbf00212cd6db7e45a9: Status 404 returned error can't find the container with id fb2e92c62b0ed0fa59021e17c9f1e50c03ea5db71fe71fbf00212cd6db7e45a9 Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.654540 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-hb5lh"] Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.668058 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kq9cj" Sep 29 10:07:28 crc kubenswrapper[4891]: W0929 10:07:28.675780 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda94a9f15_db51_47f4_9456_5c64e83e413f.slice/crio-7999f5c81ee453651f9597cd68fdebadf56281a124b22b57c0e6f0e20199e8e4 WatchSource:0}: Error finding container 7999f5c81ee453651f9597cd68fdebadf56281a124b22b57c0e6f0e20199e8e4: Status 404 returned error can't find the container with id 7999f5c81ee453651f9597cd68fdebadf56281a124b22b57c0e6f0e20199e8e4 Sep 29 10:07:28 crc kubenswrapper[4891]: I0929 10:07:28.676865 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 10:07:29 crc kubenswrapper[4891]: I0929 10:07:29.054733 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3cf2b7a9-9892-47f9-b290-775806be28cc","Type":"ContainerStarted","Data":"bad59f544f0d7a8cf8c1d42abd134a70827b3835f202344d0cdba539d30fb7d9"} Sep 29 10:07:29 crc kubenswrapper[4891]: I0929 10:07:29.056566 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a94a9f15-db51-47f4-9456-5c64e83e413f","Type":"ContainerStarted","Data":"7999f5c81ee453651f9597cd68fdebadf56281a124b22b57c0e6f0e20199e8e4"} Sep 29 10:07:29 crc kubenswrapper[4891]: I0929 10:07:29.058484 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"62b86eba-1d95-4b53-b84c-902ff2665b50","Type":"ContainerStarted","Data":"2c03290a437a656cb39b45074eddcc2cadcc552334ac2f289abc81902984accb"} Sep 29 10:07:29 crc kubenswrapper[4891]: I0929 10:07:29.061731 4891 generic.go:334] "Generic (PLEG): container finished" podID="dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4" containerID="af9f22179b80c4585857bf4d15a39ad22e0d3382386db9c8c3847791084ee533" exitCode=0 Sep 29 10:07:29 crc kubenswrapper[4891]: I0929 
10:07:29.061816 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-hb5lh" event={"ID":"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4","Type":"ContainerDied","Data":"af9f22179b80c4585857bf4d15a39ad22e0d3382386db9c8c3847791084ee533"} Sep 29 10:07:29 crc kubenswrapper[4891]: I0929 10:07:29.061838 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-hb5lh" event={"ID":"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4","Type":"ContainerStarted","Data":"fb2e92c62b0ed0fa59021e17c9f1e50c03ea5db71fe71fbf00212cd6db7e45a9"} Sep 29 10:07:29 crc kubenswrapper[4891]: I0929 10:07:29.067275 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"86fdb604-1923-4ae3-8773-92b58196f1c7","Type":"ContainerStarted","Data":"c993df9e94c364801b623666da640b1280be642f797515901293b5d4ae97d138"} Sep 29 10:07:29 crc kubenswrapper[4891]: I0929 10:07:29.074964 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-d7m4m" event={"ID":"b17acc18-5c95-4d7c-9576-ba976472f02d","Type":"ContainerStarted","Data":"2d45602c0835c15b231030f2d961822d224fdeb285a557b60e5f0d9bf637b68c"} Sep 29 10:07:29 crc kubenswrapper[4891]: I0929 10:07:29.075019 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-d7m4m" event={"ID":"b17acc18-5c95-4d7c-9576-ba976472f02d","Type":"ContainerStarted","Data":"0e26d7ae513f42d733feea5cfda1b92cd43360888a4ec46ea5c4371832ac4198"} Sep 29 10:07:29 crc kubenswrapper[4891]: I0929 10:07:29.105632 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-d7m4m" podStartSLOduration=2.10560892 podStartE2EDuration="2.10560892s" podCreationTimestamp="2025-09-29 10:07:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:07:29.103581222 +0000 UTC m=+1179.308749543" 
watchObservedRunningTime="2025-09-29 10:07:29.10560892 +0000 UTC m=+1179.310777241" Sep 29 10:07:29 crc kubenswrapper[4891]: I0929 10:07:29.182607 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kq9cj"] Sep 29 10:07:30 crc kubenswrapper[4891]: I0929 10:07:30.089472 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kq9cj" event={"ID":"8c8148fa-ad96-4f4b-8910-7233808ce733","Type":"ContainerStarted","Data":"c9c9b0e6401067a374ff2dd6c76854ccab5f1fcd5c405c04588ebd8e17ef8ea4"} Sep 29 10:07:30 crc kubenswrapper[4891]: I0929 10:07:30.090169 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kq9cj" event={"ID":"8c8148fa-ad96-4f4b-8910-7233808ce733","Type":"ContainerStarted","Data":"395c5c36adf7c502a73d4c55d43ed9396227291a48784b9a0b2ea8c5ad23f822"} Sep 29 10:07:30 crc kubenswrapper[4891]: I0929 10:07:30.096837 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-hb5lh" event={"ID":"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4","Type":"ContainerStarted","Data":"c08fb546c939415a96fe25c5c3a74393d04753898c97fb994d9fc9cdd74a3c1c"} Sep 29 10:07:30 crc kubenswrapper[4891]: I0929 10:07:30.113064 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-kq9cj" podStartSLOduration=2.113036365 podStartE2EDuration="2.113036365s" podCreationTimestamp="2025-09-29 10:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:07:30.1072602 +0000 UTC m=+1180.312428541" watchObservedRunningTime="2025-09-29 10:07:30.113036365 +0000 UTC m=+1180.318204696" Sep 29 10:07:30 crc kubenswrapper[4891]: I0929 10:07:30.130269 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-865f5d856f-hb5lh" podStartSLOduration=3.130249715 
podStartE2EDuration="3.130249715s" podCreationTimestamp="2025-09-29 10:07:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:07:30.128070993 +0000 UTC m=+1180.333239334" watchObservedRunningTime="2025-09-29 10:07:30.130249715 +0000 UTC m=+1180.335418036" Sep 29 10:07:31 crc kubenswrapper[4891]: I0929 10:07:31.024638 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 10:07:31 crc kubenswrapper[4891]: I0929 10:07:31.035880 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:07:31 crc kubenswrapper[4891]: I0929 10:07:31.109833 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-865f5d856f-hb5lh" Sep 29 10:07:33 crc kubenswrapper[4891]: I0929 10:07:33.130514 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3cf2b7a9-9892-47f9-b290-775806be28cc","Type":"ContainerStarted","Data":"0dfcaad1d2aa58b0e3b97cf3fa080dad29331b2753de4550b1329ac394e520d3"} Sep 29 10:07:33 crc kubenswrapper[4891]: I0929 10:07:33.132498 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a94a9f15-db51-47f4-9456-5c64e83e413f","Type":"ContainerStarted","Data":"3ab23e15bdc4bf422e11da792917c1ceac6fdb7285179e467711afac5b806ec0"} Sep 29 10:07:33 crc kubenswrapper[4891]: I0929 10:07:33.132596 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="a94a9f15-db51-47f4-9456-5c64e83e413f" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://3ab23e15bdc4bf422e11da792917c1ceac6fdb7285179e467711afac5b806ec0" gracePeriod=30 Sep 29 10:07:33 crc kubenswrapper[4891]: I0929 10:07:33.138001 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"62b86eba-1d95-4b53-b84c-902ff2665b50","Type":"ContainerStarted","Data":"cc70d7a9d7d7e352a6092b4334e2d05a8d1e5350fa216097c4758243cef8220a"} Sep 29 10:07:33 crc kubenswrapper[4891]: I0929 10:07:33.138047 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"62b86eba-1d95-4b53-b84c-902ff2665b50","Type":"ContainerStarted","Data":"d87afe6c42465f39dc5b47f8faafc1c99b8f5b7c813030210af054dbee3f8399"} Sep 29 10:07:33 crc kubenswrapper[4891]: I0929 10:07:33.140942 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"86fdb604-1923-4ae3-8773-92b58196f1c7","Type":"ContainerStarted","Data":"b8e7001c19d74a38c66d1e664f57bc4d7be0958c32a069bc9df5a633fbdae690"} Sep 29 10:07:33 crc kubenswrapper[4891]: I0929 10:07:33.157265 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.283877688 podStartE2EDuration="6.157245626s" podCreationTimestamp="2025-09-29 10:07:27 +0000 UTC" firstStartedPulling="2025-09-29 10:07:28.544940844 +0000 UTC m=+1178.750109175" lastFinishedPulling="2025-09-29 10:07:32.418308792 +0000 UTC m=+1182.623477113" observedRunningTime="2025-09-29 10:07:33.149098624 +0000 UTC m=+1183.354266945" watchObservedRunningTime="2025-09-29 10:07:33.157245626 +0000 UTC m=+1183.362413947" Sep 29 10:07:33 crc kubenswrapper[4891]: I0929 10:07:33.180983 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.4424535880000002 podStartE2EDuration="6.180942562s" podCreationTimestamp="2025-09-29 10:07:27 +0000 UTC" firstStartedPulling="2025-09-29 10:07:28.692855219 +0000 UTC m=+1178.898023540" lastFinishedPulling="2025-09-29 10:07:32.431344193 +0000 UTC m=+1182.636512514" observedRunningTime="2025-09-29 10:07:33.176278729 +0000 UTC m=+1183.381447060" watchObservedRunningTime="2025-09-29 10:07:33.180942562 +0000 UTC m=+1183.386110893" Sep 29 
10:07:33 crc kubenswrapper[4891]: I0929 10:07:33.201489 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.059854865 podStartE2EDuration="6.201470796s" podCreationTimestamp="2025-09-29 10:07:27 +0000 UTC" firstStartedPulling="2025-09-29 10:07:28.276686051 +0000 UTC m=+1178.481854362" lastFinishedPulling="2025-09-29 10:07:32.418301962 +0000 UTC m=+1182.623470293" observedRunningTime="2025-09-29 10:07:33.197863854 +0000 UTC m=+1183.403032175" watchObservedRunningTime="2025-09-29 10:07:33.201470796 +0000 UTC m=+1183.406639117" Sep 29 10:07:34 crc kubenswrapper[4891]: I0929 10:07:34.153616 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"86fdb604-1923-4ae3-8773-92b58196f1c7","Type":"ContainerStarted","Data":"23aa19bd994ebd73490aa2ef6cf2efbe5d71dc6283bcbd2d962f315d6ebfd82b"} Sep 29 10:07:34 crc kubenswrapper[4891]: I0929 10:07:34.154114 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="86fdb604-1923-4ae3-8773-92b58196f1c7" containerName="nova-metadata-log" containerID="cri-o://b8e7001c19d74a38c66d1e664f57bc4d7be0958c32a069bc9df5a633fbdae690" gracePeriod=30 Sep 29 10:07:34 crc kubenswrapper[4891]: I0929 10:07:34.154285 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="86fdb604-1923-4ae3-8773-92b58196f1c7" containerName="nova-metadata-metadata" containerID="cri-o://23aa19bd994ebd73490aa2ef6cf2efbe5d71dc6283bcbd2d962f315d6ebfd82b" gracePeriod=30 Sep 29 10:07:34 crc kubenswrapper[4891]: I0929 10:07:34.183862 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.221572757 podStartE2EDuration="7.183835188s" podCreationTimestamp="2025-09-29 10:07:27 +0000 UTC" firstStartedPulling="2025-09-29 10:07:28.4560031 +0000 UTC m=+1178.661171421" 
lastFinishedPulling="2025-09-29 10:07:32.418265531 +0000 UTC m=+1182.623433852" observedRunningTime="2025-09-29 10:07:34.179446343 +0000 UTC m=+1184.384614664" watchObservedRunningTime="2025-09-29 10:07:34.183835188 +0000 UTC m=+1184.389003509" Sep 29 10:07:34 crc kubenswrapper[4891]: I0929 10:07:34.779485 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:07:34 crc kubenswrapper[4891]: I0929 10:07:34.883614 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86fdb604-1923-4ae3-8773-92b58196f1c7-logs\") pod \"86fdb604-1923-4ae3-8773-92b58196f1c7\" (UID: \"86fdb604-1923-4ae3-8773-92b58196f1c7\") " Sep 29 10:07:34 crc kubenswrapper[4891]: I0929 10:07:34.884161 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cllxb\" (UniqueName: \"kubernetes.io/projected/86fdb604-1923-4ae3-8773-92b58196f1c7-kube-api-access-cllxb\") pod \"86fdb604-1923-4ae3-8773-92b58196f1c7\" (UID: \"86fdb604-1923-4ae3-8773-92b58196f1c7\") " Sep 29 10:07:34 crc kubenswrapper[4891]: I0929 10:07:34.884174 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86fdb604-1923-4ae3-8773-92b58196f1c7-logs" (OuterVolumeSpecName: "logs") pod "86fdb604-1923-4ae3-8773-92b58196f1c7" (UID: "86fdb604-1923-4ae3-8773-92b58196f1c7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:07:34 crc kubenswrapper[4891]: I0929 10:07:34.884396 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86fdb604-1923-4ae3-8773-92b58196f1c7-config-data\") pod \"86fdb604-1923-4ae3-8773-92b58196f1c7\" (UID: \"86fdb604-1923-4ae3-8773-92b58196f1c7\") " Sep 29 10:07:34 crc kubenswrapper[4891]: I0929 10:07:34.884456 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86fdb604-1923-4ae3-8773-92b58196f1c7-combined-ca-bundle\") pod \"86fdb604-1923-4ae3-8773-92b58196f1c7\" (UID: \"86fdb604-1923-4ae3-8773-92b58196f1c7\") " Sep 29 10:07:34 crc kubenswrapper[4891]: I0929 10:07:34.885192 4891 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86fdb604-1923-4ae3-8773-92b58196f1c7-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:34 crc kubenswrapper[4891]: I0929 10:07:34.893908 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86fdb604-1923-4ae3-8773-92b58196f1c7-kube-api-access-cllxb" (OuterVolumeSpecName: "kube-api-access-cllxb") pod "86fdb604-1923-4ae3-8773-92b58196f1c7" (UID: "86fdb604-1923-4ae3-8773-92b58196f1c7"). InnerVolumeSpecName "kube-api-access-cllxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:07:34 crc kubenswrapper[4891]: I0929 10:07:34.916176 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86fdb604-1923-4ae3-8773-92b58196f1c7-config-data" (OuterVolumeSpecName: "config-data") pod "86fdb604-1923-4ae3-8773-92b58196f1c7" (UID: "86fdb604-1923-4ae3-8773-92b58196f1c7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:34 crc kubenswrapper[4891]: I0929 10:07:34.922141 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86fdb604-1923-4ae3-8773-92b58196f1c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86fdb604-1923-4ae3-8773-92b58196f1c7" (UID: "86fdb604-1923-4ae3-8773-92b58196f1c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:34 crc kubenswrapper[4891]: I0929 10:07:34.987183 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cllxb\" (UniqueName: \"kubernetes.io/projected/86fdb604-1923-4ae3-8773-92b58196f1c7-kube-api-access-cllxb\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:34 crc kubenswrapper[4891]: I0929 10:07:34.987231 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86fdb604-1923-4ae3-8773-92b58196f1c7-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:34 crc kubenswrapper[4891]: I0929 10:07:34.987246 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86fdb604-1923-4ae3-8773-92b58196f1c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.166289 4891 generic.go:334] "Generic (PLEG): container finished" podID="86fdb604-1923-4ae3-8773-92b58196f1c7" containerID="23aa19bd994ebd73490aa2ef6cf2efbe5d71dc6283bcbd2d962f315d6ebfd82b" exitCode=0 Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.166337 4891 generic.go:334] "Generic (PLEG): container finished" podID="86fdb604-1923-4ae3-8773-92b58196f1c7" containerID="b8e7001c19d74a38c66d1e664f57bc4d7be0958c32a069bc9df5a633fbdae690" exitCode=143 Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.166370 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"86fdb604-1923-4ae3-8773-92b58196f1c7","Type":"ContainerDied","Data":"23aa19bd994ebd73490aa2ef6cf2efbe5d71dc6283bcbd2d962f315d6ebfd82b"} Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.166387 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.166415 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"86fdb604-1923-4ae3-8773-92b58196f1c7","Type":"ContainerDied","Data":"b8e7001c19d74a38c66d1e664f57bc4d7be0958c32a069bc9df5a633fbdae690"} Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.166430 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"86fdb604-1923-4ae3-8773-92b58196f1c7","Type":"ContainerDied","Data":"c993df9e94c364801b623666da640b1280be642f797515901293b5d4ae97d138"} Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.166451 4891 scope.go:117] "RemoveContainer" containerID="23aa19bd994ebd73490aa2ef6cf2efbe5d71dc6283bcbd2d962f315d6ebfd82b" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.189836 4891 scope.go:117] "RemoveContainer" containerID="b8e7001c19d74a38c66d1e664f57bc4d7be0958c32a069bc9df5a633fbdae690" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.218187 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.218765 4891 scope.go:117] "RemoveContainer" containerID="23aa19bd994ebd73490aa2ef6cf2efbe5d71dc6283bcbd2d962f315d6ebfd82b" Sep 29 10:07:35 crc kubenswrapper[4891]: E0929 10:07:35.219572 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23aa19bd994ebd73490aa2ef6cf2efbe5d71dc6283bcbd2d962f315d6ebfd82b\": container with ID starting with 23aa19bd994ebd73490aa2ef6cf2efbe5d71dc6283bcbd2d962f315d6ebfd82b not found: ID does not exist" 
containerID="23aa19bd994ebd73490aa2ef6cf2efbe5d71dc6283bcbd2d962f315d6ebfd82b" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.219645 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23aa19bd994ebd73490aa2ef6cf2efbe5d71dc6283bcbd2d962f315d6ebfd82b"} err="failed to get container status \"23aa19bd994ebd73490aa2ef6cf2efbe5d71dc6283bcbd2d962f315d6ebfd82b\": rpc error: code = NotFound desc = could not find container \"23aa19bd994ebd73490aa2ef6cf2efbe5d71dc6283bcbd2d962f315d6ebfd82b\": container with ID starting with 23aa19bd994ebd73490aa2ef6cf2efbe5d71dc6283bcbd2d962f315d6ebfd82b not found: ID does not exist" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.219686 4891 scope.go:117] "RemoveContainer" containerID="b8e7001c19d74a38c66d1e664f57bc4d7be0958c32a069bc9df5a633fbdae690" Sep 29 10:07:35 crc kubenswrapper[4891]: E0929 10:07:35.220158 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8e7001c19d74a38c66d1e664f57bc4d7be0958c32a069bc9df5a633fbdae690\": container with ID starting with b8e7001c19d74a38c66d1e664f57bc4d7be0958c32a069bc9df5a633fbdae690 not found: ID does not exist" containerID="b8e7001c19d74a38c66d1e664f57bc4d7be0958c32a069bc9df5a633fbdae690" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.220195 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8e7001c19d74a38c66d1e664f57bc4d7be0958c32a069bc9df5a633fbdae690"} err="failed to get container status \"b8e7001c19d74a38c66d1e664f57bc4d7be0958c32a069bc9df5a633fbdae690\": rpc error: code = NotFound desc = could not find container \"b8e7001c19d74a38c66d1e664f57bc4d7be0958c32a069bc9df5a633fbdae690\": container with ID starting with b8e7001c19d74a38c66d1e664f57bc4d7be0958c32a069bc9df5a633fbdae690 not found: ID does not exist" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.220223 4891 scope.go:117] 
"RemoveContainer" containerID="23aa19bd994ebd73490aa2ef6cf2efbe5d71dc6283bcbd2d962f315d6ebfd82b" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.220565 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23aa19bd994ebd73490aa2ef6cf2efbe5d71dc6283bcbd2d962f315d6ebfd82b"} err="failed to get container status \"23aa19bd994ebd73490aa2ef6cf2efbe5d71dc6283bcbd2d962f315d6ebfd82b\": rpc error: code = NotFound desc = could not find container \"23aa19bd994ebd73490aa2ef6cf2efbe5d71dc6283bcbd2d962f315d6ebfd82b\": container with ID starting with 23aa19bd994ebd73490aa2ef6cf2efbe5d71dc6283bcbd2d962f315d6ebfd82b not found: ID does not exist" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.220599 4891 scope.go:117] "RemoveContainer" containerID="b8e7001c19d74a38c66d1e664f57bc4d7be0958c32a069bc9df5a633fbdae690" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.221110 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8e7001c19d74a38c66d1e664f57bc4d7be0958c32a069bc9df5a633fbdae690"} err="failed to get container status \"b8e7001c19d74a38c66d1e664f57bc4d7be0958c32a069bc9df5a633fbdae690\": rpc error: code = NotFound desc = could not find container \"b8e7001c19d74a38c66d1e664f57bc4d7be0958c32a069bc9df5a633fbdae690\": container with ID starting with b8e7001c19d74a38c66d1e664f57bc4d7be0958c32a069bc9df5a633fbdae690 not found: ID does not exist" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.227782 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.251200 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:07:35 crc kubenswrapper[4891]: E0929 10:07:35.255817 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86fdb604-1923-4ae3-8773-92b58196f1c7" containerName="nova-metadata-metadata" Sep 29 10:07:35 crc 
kubenswrapper[4891]: I0929 10:07:35.255853 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="86fdb604-1923-4ae3-8773-92b58196f1c7" containerName="nova-metadata-metadata" Sep 29 10:07:35 crc kubenswrapper[4891]: E0929 10:07:35.255898 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86fdb604-1923-4ae3-8773-92b58196f1c7" containerName="nova-metadata-log" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.255905 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="86fdb604-1923-4ae3-8773-92b58196f1c7" containerName="nova-metadata-log" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.256096 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="86fdb604-1923-4ae3-8773-92b58196f1c7" containerName="nova-metadata-log" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.256108 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="86fdb604-1923-4ae3-8773-92b58196f1c7" containerName="nova-metadata-metadata" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.257370 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.259887 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.260021 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.264146 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.293286 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\") " pod="openstack/nova-metadata-0" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.293365 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-config-data\") pod \"nova-metadata-0\" (UID: \"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\") " pod="openstack/nova-metadata-0" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.293573 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-logs\") pod \"nova-metadata-0\" (UID: \"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\") " pod="openstack/nova-metadata-0" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.293639 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\") " pod="openstack/nova-metadata-0" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.293866 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wgwq\" (UniqueName: \"kubernetes.io/projected/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-kube-api-access-4wgwq\") pod \"nova-metadata-0\" (UID: \"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\") " pod="openstack/nova-metadata-0" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.396282 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-config-data\") pod \"nova-metadata-0\" (UID: \"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\") " pod="openstack/nova-metadata-0" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.396500 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-logs\") pod \"nova-metadata-0\" (UID: \"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\") " pod="openstack/nova-metadata-0" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.396546 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\") " pod="openstack/nova-metadata-0" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.396730 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wgwq\" (UniqueName: \"kubernetes.io/projected/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-kube-api-access-4wgwq\") pod \"nova-metadata-0\" (UID: \"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\") " pod="openstack/nova-metadata-0" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.396954 4891 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\") " pod="openstack/nova-metadata-0" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.398138 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-logs\") pod \"nova-metadata-0\" (UID: \"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\") " pod="openstack/nova-metadata-0" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.401202 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-config-data\") pod \"nova-metadata-0\" (UID: \"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\") " pod="openstack/nova-metadata-0" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.404340 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\") " pod="openstack/nova-metadata-0" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.416919 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\") " pod="openstack/nova-metadata-0" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.430615 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wgwq\" (UniqueName: \"kubernetes.io/projected/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-kube-api-access-4wgwq\") pod 
\"nova-metadata-0\" (UID: \"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\") " pod="openstack/nova-metadata-0" Sep 29 10:07:35 crc kubenswrapper[4891]: I0929 10:07:35.591443 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:07:36 crc kubenswrapper[4891]: I0929 10:07:36.131217 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:07:36 crc kubenswrapper[4891]: W0929 10:07:36.133952 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15eac9a5_90a2_49b9_9bdd_0a6074b5e4be.slice/crio-437e08cb26292bf1e6d22c64904af4ad1821828435827743513606a4a452ab39 WatchSource:0}: Error finding container 437e08cb26292bf1e6d22c64904af4ad1821828435827743513606a4a452ab39: Status 404 returned error can't find the container with id 437e08cb26292bf1e6d22c64904af4ad1821828435827743513606a4a452ab39 Sep 29 10:07:36 crc kubenswrapper[4891]: I0929 10:07:36.178602 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be","Type":"ContainerStarted","Data":"437e08cb26292bf1e6d22c64904af4ad1821828435827743513606a4a452ab39"} Sep 29 10:07:36 crc kubenswrapper[4891]: I0929 10:07:36.186317 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:07:36 crc kubenswrapper[4891]: I0929 10:07:36.186589 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Sep 29 10:07:36 crc kubenswrapper[4891]: I0929 10:07:36.412064 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86fdb604-1923-4ae3-8773-92b58196f1c7" path="/var/lib/kubelet/pods/86fdb604-1923-4ae3-8773-92b58196f1c7/volumes" Sep 29 10:07:37 crc kubenswrapper[4891]: I0929 10:07:37.192671 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be","Type":"ContainerStarted","Data":"8e2fe12b3f84000f5cc30df842c398fe573021442db61c57e7d090bd5390070c"} Sep 29 10:07:37 crc kubenswrapper[4891]: I0929 10:07:37.193046 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be","Type":"ContainerStarted","Data":"c52993935dc82b6896aae181c0b23afe37e3865b8be24f996cd555158fc045da"} Sep 29 10:07:37 crc kubenswrapper[4891]: I0929 10:07:37.194965 4891 generic.go:334] "Generic (PLEG): container finished" podID="b17acc18-5c95-4d7c-9576-ba976472f02d" containerID="2d45602c0835c15b231030f2d961822d224fdeb285a557b60e5f0d9bf637b68c" exitCode=0 Sep 29 10:07:37 crc kubenswrapper[4891]: I0929 10:07:37.195029 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-d7m4m" event={"ID":"b17acc18-5c95-4d7c-9576-ba976472f02d","Type":"ContainerDied","Data":"2d45602c0835c15b231030f2d961822d224fdeb285a557b60e5f0d9bf637b68c"} Sep 29 10:07:37 crc kubenswrapper[4891]: I0929 10:07:37.220045 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.220020711 podStartE2EDuration="2.220020711s" podCreationTimestamp="2025-09-29 10:07:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:07:37.210282113 +0000 UTC m=+1187.415450464" watchObservedRunningTime="2025-09-29 10:07:37.220020711 +0000 UTC m=+1187.425189042" 
Sep 29 10:07:37 crc kubenswrapper[4891]: I0929 10:07:37.701663 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 29 10:07:37 crc kubenswrapper[4891]: I0929 10:07:37.701719 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 29 10:07:37 crc kubenswrapper[4891]: I0929 10:07:37.736208 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 29 10:07:37 crc kubenswrapper[4891]: I0929 10:07:37.736260 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 29 10:07:37 crc kubenswrapper[4891]: I0929 10:07:37.775037 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 29 10:07:37 crc kubenswrapper[4891]: I0929 10:07:37.797127 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-865f5d856f-hb5lh" Sep 29 10:07:37 crc kubenswrapper[4891]: I0929 10:07:37.879171 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-jjh8f"] Sep 29 10:07:37 crc kubenswrapper[4891]: I0929 10:07:37.879729 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f" podUID="27917a33-23af-42a4-a85c-3bf8f1f9c1d0" containerName="dnsmasq-dns" containerID="cri-o://4299840af8eb34673b5ae9dec18e4b1d4b83b14a3079ad95a1d68ef73db2207b" gracePeriod=10 Sep 29 10:07:37 crc kubenswrapper[4891]: I0929 10:07:37.946253 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.205889 4891 generic.go:334] "Generic (PLEG): container finished" podID="27917a33-23af-42a4-a85c-3bf8f1f9c1d0" containerID="4299840af8eb34673b5ae9dec18e4b1d4b83b14a3079ad95a1d68ef73db2207b" exitCode=0 Sep 29 10:07:38 crc 
kubenswrapper[4891]: I0929 10:07:38.205963 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f" event={"ID":"27917a33-23af-42a4-a85c-3bf8f1f9c1d0","Type":"ContainerDied","Data":"4299840af8eb34673b5ae9dec18e4b1d4b83b14a3079ad95a1d68ef73db2207b"} Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.261664 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.470929 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.474104 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-ovsdbserver-nb\") pod \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\" (UID: \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.474142 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-dns-swift-storage-0\") pod \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\" (UID: \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.474223 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-ovsdbserver-sb\") pod \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\" (UID: \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.559054 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod 
"27917a33-23af-42a4-a85c-3bf8f1f9c1d0" (UID: "27917a33-23af-42a4-a85c-3bf8f1f9c1d0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.584161 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "27917a33-23af-42a4-a85c-3bf8f1f9c1d0" (UID: "27917a33-23af-42a4-a85c-3bf8f1f9c1d0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.584519 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-config\") pod \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\" (UID: \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.584704 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srnzv\" (UniqueName: \"kubernetes.io/projected/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-kube-api-access-srnzv\") pod \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\" (UID: \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.584882 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-dns-svc\") pod \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\" (UID: \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.584954 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-ovsdbserver-nb\") pod \"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\" (UID: 
\"27917a33-23af-42a4-a85c-3bf8f1f9c1d0\") " Sep 29 10:07:38 crc kubenswrapper[4891]: W0929 10:07:38.585254 4891 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/27917a33-23af-42a4-a85c-3bf8f1f9c1d0/volumes/kubernetes.io~configmap/ovsdbserver-nb Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.585295 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "27917a33-23af-42a4-a85c-3bf8f1f9c1d0" (UID: "27917a33-23af-42a4-a85c-3bf8f1f9c1d0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.586020 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.586048 4891 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.588557 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-kube-api-access-srnzv" (OuterVolumeSpecName: "kube-api-access-srnzv") pod "27917a33-23af-42a4-a85c-3bf8f1f9c1d0" (UID: "27917a33-23af-42a4-a85c-3bf8f1f9c1d0"). InnerVolumeSpecName "kube-api-access-srnzv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.593736 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "27917a33-23af-42a4-a85c-3bf8f1f9c1d0" (UID: "27917a33-23af-42a4-a85c-3bf8f1f9c1d0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.615211 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-d7m4m" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.650225 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-config" (OuterVolumeSpecName: "config") pod "27917a33-23af-42a4-a85c-3bf8f1f9c1d0" (UID: "27917a33-23af-42a4-a85c-3bf8f1f9c1d0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.654251 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "27917a33-23af-42a4-a85c-3bf8f1f9c1d0" (UID: "27917a33-23af-42a4-a85c-3bf8f1f9c1d0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.687599 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbqfw\" (UniqueName: \"kubernetes.io/projected/b17acc18-5c95-4d7c-9576-ba976472f02d-kube-api-access-pbqfw\") pod \"b17acc18-5c95-4d7c-9576-ba976472f02d\" (UID: \"b17acc18-5c95-4d7c-9576-ba976472f02d\") " Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.687821 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b17acc18-5c95-4d7c-9576-ba976472f02d-scripts\") pod \"b17acc18-5c95-4d7c-9576-ba976472f02d\" (UID: \"b17acc18-5c95-4d7c-9576-ba976472f02d\") " Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.687976 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b17acc18-5c95-4d7c-9576-ba976472f02d-config-data\") pod \"b17acc18-5c95-4d7c-9576-ba976472f02d\" (UID: \"b17acc18-5c95-4d7c-9576-ba976472f02d\") " Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.688031 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b17acc18-5c95-4d7c-9576-ba976472f02d-combined-ca-bundle\") pod \"b17acc18-5c95-4d7c-9576-ba976472f02d\" (UID: \"b17acc18-5c95-4d7c-9576-ba976472f02d\") " Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.688617 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.688661 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-config\") on node \"crc\" DevicePath \"\"" Sep 29 
10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.688672 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srnzv\" (UniqueName: \"kubernetes.io/projected/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-kube-api-access-srnzv\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.688683 4891 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27917a33-23af-42a4-a85c-3bf8f1f9c1d0-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.693429 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b17acc18-5c95-4d7c-9576-ba976472f02d-scripts" (OuterVolumeSpecName: "scripts") pod "b17acc18-5c95-4d7c-9576-ba976472f02d" (UID: "b17acc18-5c95-4d7c-9576-ba976472f02d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.694080 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b17acc18-5c95-4d7c-9576-ba976472f02d-kube-api-access-pbqfw" (OuterVolumeSpecName: "kube-api-access-pbqfw") pod "b17acc18-5c95-4d7c-9576-ba976472f02d" (UID: "b17acc18-5c95-4d7c-9576-ba976472f02d"). InnerVolumeSpecName "kube-api-access-pbqfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.723351 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b17acc18-5c95-4d7c-9576-ba976472f02d-config-data" (OuterVolumeSpecName: "config-data") pod "b17acc18-5c95-4d7c-9576-ba976472f02d" (UID: "b17acc18-5c95-4d7c-9576-ba976472f02d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.725773 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b17acc18-5c95-4d7c-9576-ba976472f02d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b17acc18-5c95-4d7c-9576-ba976472f02d" (UID: "b17acc18-5c95-4d7c-9576-ba976472f02d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.784969 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="62b86eba-1d95-4b53-b84c-902ff2665b50" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.785232 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="62b86eba-1d95-4b53-b84c-902ff2665b50" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.790527 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b17acc18-5c95-4d7c-9576-ba976472f02d-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.790556 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b17acc18-5c95-4d7c-9576-ba976472f02d-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.790571 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b17acc18-5c95-4d7c-9576-ba976472f02d-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Sep 29 10:07:38 crc kubenswrapper[4891]: I0929 10:07:38.790584 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbqfw\" (UniqueName: \"kubernetes.io/projected/b17acc18-5c95-4d7c-9576-ba976472f02d-kube-api-access-pbqfw\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:39 crc kubenswrapper[4891]: I0929 10:07:39.217442 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-d7m4m" event={"ID":"b17acc18-5c95-4d7c-9576-ba976472f02d","Type":"ContainerDied","Data":"0e26d7ae513f42d733feea5cfda1b92cd43360888a4ec46ea5c4371832ac4198"} Sep 29 10:07:39 crc kubenswrapper[4891]: I0929 10:07:39.218516 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e26d7ae513f42d733feea5cfda1b92cd43360888a4ec46ea5c4371832ac4198" Sep 29 10:07:39 crc kubenswrapper[4891]: I0929 10:07:39.217474 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-d7m4m" Sep 29 10:07:39 crc kubenswrapper[4891]: I0929 10:07:39.219728 4891 generic.go:334] "Generic (PLEG): container finished" podID="8c8148fa-ad96-4f4b-8910-7233808ce733" containerID="c9c9b0e6401067a374ff2dd6c76854ccab5f1fcd5c405c04588ebd8e17ef8ea4" exitCode=0 Sep 29 10:07:39 crc kubenswrapper[4891]: I0929 10:07:39.219762 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kq9cj" event={"ID":"8c8148fa-ad96-4f4b-8910-7233808ce733","Type":"ContainerDied","Data":"c9c9b0e6401067a374ff2dd6c76854ccab5f1fcd5c405c04588ebd8e17ef8ea4"} Sep 29 10:07:39 crc kubenswrapper[4891]: I0929 10:07:39.222526 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f" Sep 29 10:07:39 crc kubenswrapper[4891]: I0929 10:07:39.223913 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f" event={"ID":"27917a33-23af-42a4-a85c-3bf8f1f9c1d0","Type":"ContainerDied","Data":"31c1766316c7f6b5da97b4049dd08372da38af4bde548ac50ee6e581a9a5cfe0"} Sep 29 10:07:39 crc kubenswrapper[4891]: I0929 10:07:39.223990 4891 scope.go:117] "RemoveContainer" containerID="4299840af8eb34673b5ae9dec18e4b1d4b83b14a3079ad95a1d68ef73db2207b" Sep 29 10:07:39 crc kubenswrapper[4891]: I0929 10:07:39.277059 4891 scope.go:117] "RemoveContainer" containerID="6aa9d893d89c701ad32acee5980613f9e2a92e0643ffead4f8d1fa6089941c14" Sep 29 10:07:39 crc kubenswrapper[4891]: I0929 10:07:39.309667 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-jjh8f"] Sep 29 10:07:39 crc kubenswrapper[4891]: I0929 10:07:39.338685 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-jjh8f"] Sep 29 10:07:39 crc kubenswrapper[4891]: I0929 10:07:39.424140 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:07:39 crc kubenswrapper[4891]: I0929 10:07:39.424426 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="62b86eba-1d95-4b53-b84c-902ff2665b50" containerName="nova-api-log" containerID="cri-o://d87afe6c42465f39dc5b47f8faafc1c99b8f5b7c813030210af054dbee3f8399" gracePeriod=30 Sep 29 10:07:39 crc kubenswrapper[4891]: I0929 10:07:39.424465 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="62b86eba-1d95-4b53-b84c-902ff2665b50" containerName="nova-api-api" containerID="cri-o://cc70d7a9d7d7e352a6092b4334e2d05a8d1e5350fa216097c4758243cef8220a" gracePeriod=30 Sep 29 10:07:39 crc kubenswrapper[4891]: I0929 10:07:39.436996 4891 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:07:39 crc kubenswrapper[4891]: I0929 10:07:39.452961 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:07:39 crc kubenswrapper[4891]: I0929 10:07:39.453400 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="15eac9a5-90a2-49b9-9bdd-0a6074b5e4be" containerName="nova-metadata-metadata" containerID="cri-o://8e2fe12b3f84000f5cc30df842c398fe573021442db61c57e7d090bd5390070c" gracePeriod=30 Sep 29 10:07:39 crc kubenswrapper[4891]: I0929 10:07:39.453245 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="15eac9a5-90a2-49b9-9bdd-0a6074b5e4be" containerName="nova-metadata-log" containerID="cri-o://c52993935dc82b6896aae181c0b23afe37e3865b8be24f996cd555158fc045da" gracePeriod=30 Sep 29 10:07:39 crc kubenswrapper[4891]: I0929 10:07:39.976034 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.015661 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-combined-ca-bundle\") pod \"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\" (UID: \"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\") " Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.015717 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-nova-metadata-tls-certs\") pod \"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\" (UID: \"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\") " Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.015932 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-logs\") pod \"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\" (UID: \"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\") " Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.015994 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wgwq\" (UniqueName: \"kubernetes.io/projected/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-kube-api-access-4wgwq\") pod \"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\" (UID: \"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\") " Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.016028 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-config-data\") pod \"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\" (UID: \"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be\") " Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.022471 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-logs" (OuterVolumeSpecName: "logs") pod "15eac9a5-90a2-49b9-9bdd-0a6074b5e4be" (UID: "15eac9a5-90a2-49b9-9bdd-0a6074b5e4be"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.036023 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-kube-api-access-4wgwq" (OuterVolumeSpecName: "kube-api-access-4wgwq") pod "15eac9a5-90a2-49b9-9bdd-0a6074b5e4be" (UID: "15eac9a5-90a2-49b9-9bdd-0a6074b5e4be"). InnerVolumeSpecName "kube-api-access-4wgwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.051101 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-config-data" (OuterVolumeSpecName: "config-data") pod "15eac9a5-90a2-49b9-9bdd-0a6074b5e4be" (UID: "15eac9a5-90a2-49b9-9bdd-0a6074b5e4be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.051225 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15eac9a5-90a2-49b9-9bdd-0a6074b5e4be" (UID: "15eac9a5-90a2-49b9-9bdd-0a6074b5e4be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.085597 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "15eac9a5-90a2-49b9-9bdd-0a6074b5e4be" (UID: "15eac9a5-90a2-49b9-9bdd-0a6074b5e4be"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.118475 4891 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.118555 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wgwq\" (UniqueName: \"kubernetes.io/projected/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-kube-api-access-4wgwq\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.118572 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.118585 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.118598 4891 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.232521 4891 generic.go:334] "Generic (PLEG): container finished" podID="62b86eba-1d95-4b53-b84c-902ff2665b50" containerID="d87afe6c42465f39dc5b47f8faafc1c99b8f5b7c813030210af054dbee3f8399" exitCode=143 Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.232538 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"62b86eba-1d95-4b53-b84c-902ff2665b50","Type":"ContainerDied","Data":"d87afe6c42465f39dc5b47f8faafc1c99b8f5b7c813030210af054dbee3f8399"} 
Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.235115 4891 generic.go:334] "Generic (PLEG): container finished" podID="15eac9a5-90a2-49b9-9bdd-0a6074b5e4be" containerID="8e2fe12b3f84000f5cc30df842c398fe573021442db61c57e7d090bd5390070c" exitCode=0 Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.235159 4891 generic.go:334] "Generic (PLEG): container finished" podID="15eac9a5-90a2-49b9-9bdd-0a6074b5e4be" containerID="c52993935dc82b6896aae181c0b23afe37e3865b8be24f996cd555158fc045da" exitCode=143 Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.235173 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be","Type":"ContainerDied","Data":"8e2fe12b3f84000f5cc30df842c398fe573021442db61c57e7d090bd5390070c"} Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.235202 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be","Type":"ContainerDied","Data":"c52993935dc82b6896aae181c0b23afe37e3865b8be24f996cd555158fc045da"} Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.235215 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15eac9a5-90a2-49b9-9bdd-0a6074b5e4be","Type":"ContainerDied","Data":"437e08cb26292bf1e6d22c64904af4ad1821828435827743513606a4a452ab39"} Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.235231 4891 scope.go:117] "RemoveContainer" containerID="8e2fe12b3f84000f5cc30df842c398fe573021442db61c57e7d090bd5390070c" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.235159 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.240507 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="3cf2b7a9-9892-47f9-b290-775806be28cc" containerName="nova-scheduler-scheduler" containerID="cri-o://0dfcaad1d2aa58b0e3b97cf3fa080dad29331b2753de4550b1329ac394e520d3" gracePeriod=30 Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.289005 4891 scope.go:117] "RemoveContainer" containerID="c52993935dc82b6896aae181c0b23afe37e3865b8be24f996cd555158fc045da" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.292937 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.312855 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.318967 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:07:40 crc kubenswrapper[4891]: E0929 10:07:40.319537 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15eac9a5-90a2-49b9-9bdd-0a6074b5e4be" containerName="nova-metadata-metadata" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.319555 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="15eac9a5-90a2-49b9-9bdd-0a6074b5e4be" containerName="nova-metadata-metadata" Sep 29 10:07:40 crc kubenswrapper[4891]: E0929 10:07:40.319578 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15eac9a5-90a2-49b9-9bdd-0a6074b5e4be" containerName="nova-metadata-log" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.319597 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="15eac9a5-90a2-49b9-9bdd-0a6074b5e4be" containerName="nova-metadata-log" Sep 29 10:07:40 crc kubenswrapper[4891]: E0929 10:07:40.319609 4891 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b17acc18-5c95-4d7c-9576-ba976472f02d" containerName="nova-manage" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.319615 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="b17acc18-5c95-4d7c-9576-ba976472f02d" containerName="nova-manage" Sep 29 10:07:40 crc kubenswrapper[4891]: E0929 10:07:40.319635 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27917a33-23af-42a4-a85c-3bf8f1f9c1d0" containerName="init" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.319641 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="27917a33-23af-42a4-a85c-3bf8f1f9c1d0" containerName="init" Sep 29 10:07:40 crc kubenswrapper[4891]: E0929 10:07:40.319650 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27917a33-23af-42a4-a85c-3bf8f1f9c1d0" containerName="dnsmasq-dns" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.319658 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="27917a33-23af-42a4-a85c-3bf8f1f9c1d0" containerName="dnsmasq-dns" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.319884 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="27917a33-23af-42a4-a85c-3bf8f1f9c1d0" containerName="dnsmasq-dns" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.319898 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="b17acc18-5c95-4d7c-9576-ba976472f02d" containerName="nova-manage" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.319910 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="15eac9a5-90a2-49b9-9bdd-0a6074b5e4be" containerName="nova-metadata-log" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.319915 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="15eac9a5-90a2-49b9-9bdd-0a6074b5e4be" containerName="nova-metadata-metadata" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.322383 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.334761 4891 scope.go:117] "RemoveContainer" containerID="8e2fe12b3f84000f5cc30df842c398fe573021442db61c57e7d090bd5390070c" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.335137 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.337180 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:07:40 crc kubenswrapper[4891]: E0929 10:07:40.338315 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e2fe12b3f84000f5cc30df842c398fe573021442db61c57e7d090bd5390070c\": container with ID starting with 8e2fe12b3f84000f5cc30df842c398fe573021442db61c57e7d090bd5390070c not found: ID does not exist" containerID="8e2fe12b3f84000f5cc30df842c398fe573021442db61c57e7d090bd5390070c" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.338413 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e2fe12b3f84000f5cc30df842c398fe573021442db61c57e7d090bd5390070c"} err="failed to get container status \"8e2fe12b3f84000f5cc30df842c398fe573021442db61c57e7d090bd5390070c\": rpc error: code = NotFound desc = could not find container \"8e2fe12b3f84000f5cc30df842c398fe573021442db61c57e7d090bd5390070c\": container with ID starting with 8e2fe12b3f84000f5cc30df842c398fe573021442db61c57e7d090bd5390070c not found: ID does not exist" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.338504 4891 scope.go:117] "RemoveContainer" containerID="c52993935dc82b6896aae181c0b23afe37e3865b8be24f996cd555158fc045da" Sep 29 10:07:40 crc kubenswrapper[4891]: E0929 10:07:40.338813 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c52993935dc82b6896aae181c0b23afe37e3865b8be24f996cd555158fc045da\": container with ID starting with c52993935dc82b6896aae181c0b23afe37e3865b8be24f996cd555158fc045da not found: ID does not exist" containerID="c52993935dc82b6896aae181c0b23afe37e3865b8be24f996cd555158fc045da" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.338890 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c52993935dc82b6896aae181c0b23afe37e3865b8be24f996cd555158fc045da"} err="failed to get container status \"c52993935dc82b6896aae181c0b23afe37e3865b8be24f996cd555158fc045da\": rpc error: code = NotFound desc = could not find container \"c52993935dc82b6896aae181c0b23afe37e3865b8be24f996cd555158fc045da\": container with ID starting with c52993935dc82b6896aae181c0b23afe37e3865b8be24f996cd555158fc045da not found: ID does not exist" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.338956 4891 scope.go:117] "RemoveContainer" containerID="8e2fe12b3f84000f5cc30df842c398fe573021442db61c57e7d090bd5390070c" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.338929 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.339479 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e2fe12b3f84000f5cc30df842c398fe573021442db61c57e7d090bd5390070c"} err="failed to get container status \"8e2fe12b3f84000f5cc30df842c398fe573021442db61c57e7d090bd5390070c\": rpc error: code = NotFound desc = could not find container \"8e2fe12b3f84000f5cc30df842c398fe573021442db61c57e7d090bd5390070c\": container with ID starting with 8e2fe12b3f84000f5cc30df842c398fe573021442db61c57e7d090bd5390070c not found: ID does not exist" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.339515 4891 scope.go:117] "RemoveContainer" containerID="c52993935dc82b6896aae181c0b23afe37e3865b8be24f996cd555158fc045da" Sep 29 
10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.339698 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c52993935dc82b6896aae181c0b23afe37e3865b8be24f996cd555158fc045da"} err="failed to get container status \"c52993935dc82b6896aae181c0b23afe37e3865b8be24f996cd555158fc045da\": rpc error: code = NotFound desc = could not find container \"c52993935dc82b6896aae181c0b23afe37e3865b8be24f996cd555158fc045da\": container with ID starting with c52993935dc82b6896aae181c0b23afe37e3865b8be24f996cd555158fc045da not found: ID does not exist" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.413896 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15eac9a5-90a2-49b9-9bdd-0a6074b5e4be" path="/var/lib/kubelet/pods/15eac9a5-90a2-49b9-9bdd-0a6074b5e4be/volumes" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.414591 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27917a33-23af-42a4-a85c-3bf8f1f9c1d0" path="/var/lib/kubelet/pods/27917a33-23af-42a4-a85c-3bf8f1f9c1d0/volumes" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.424329 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djd7s\" (UniqueName: \"kubernetes.io/projected/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-kube-api-access-djd7s\") pod \"nova-metadata-0\" (UID: \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\") " pod="openstack/nova-metadata-0" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.424445 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-config-data\") pod \"nova-metadata-0\" (UID: \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\") " pod="openstack/nova-metadata-0" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.424472 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-logs\") pod \"nova-metadata-0\" (UID: \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\") " pod="openstack/nova-metadata-0" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.424519 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\") " pod="openstack/nova-metadata-0" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.424548 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\") " pod="openstack/nova-metadata-0" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.525815 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-logs\") pod \"nova-metadata-0\" (UID: \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\") " pod="openstack/nova-metadata-0" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.525897 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\") " pod="openstack/nova-metadata-0" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.525926 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\") " pod="openstack/nova-metadata-0" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.526003 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djd7s\" (UniqueName: \"kubernetes.io/projected/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-kube-api-access-djd7s\") pod \"nova-metadata-0\" (UID: \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\") " pod="openstack/nova-metadata-0" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.526083 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-config-data\") pod \"nova-metadata-0\" (UID: \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\") " pod="openstack/nova-metadata-0" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.526358 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-logs\") pod \"nova-metadata-0\" (UID: \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\") " pod="openstack/nova-metadata-0" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.543334 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\") " pod="openstack/nova-metadata-0" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.548322 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-config-data\") pod \"nova-metadata-0\" (UID: \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\") " pod="openstack/nova-metadata-0" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.552357 4891 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-djd7s\" (UniqueName: \"kubernetes.io/projected/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-kube-api-access-djd7s\") pod \"nova-metadata-0\" (UID: \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\") " pod="openstack/nova-metadata-0" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.558196 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\") " pod="openstack/nova-metadata-0" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.643100 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kq9cj" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.662416 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.728611 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c8148fa-ad96-4f4b-8910-7233808ce733-scripts\") pod \"8c8148fa-ad96-4f4b-8910-7233808ce733\" (UID: \"8c8148fa-ad96-4f4b-8910-7233808ce733\") " Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.729136 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8148fa-ad96-4f4b-8910-7233808ce733-combined-ca-bundle\") pod \"8c8148fa-ad96-4f4b-8910-7233808ce733\" (UID: \"8c8148fa-ad96-4f4b-8910-7233808ce733\") " Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.729186 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8148fa-ad96-4f4b-8910-7233808ce733-config-data\") pod \"8c8148fa-ad96-4f4b-8910-7233808ce733\" (UID: 
\"8c8148fa-ad96-4f4b-8910-7233808ce733\") " Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.729220 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gdqg\" (UniqueName: \"kubernetes.io/projected/8c8148fa-ad96-4f4b-8910-7233808ce733-kube-api-access-2gdqg\") pod \"8c8148fa-ad96-4f4b-8910-7233808ce733\" (UID: \"8c8148fa-ad96-4f4b-8910-7233808ce733\") " Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.734856 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8148fa-ad96-4f4b-8910-7233808ce733-scripts" (OuterVolumeSpecName: "scripts") pod "8c8148fa-ad96-4f4b-8910-7233808ce733" (UID: "8c8148fa-ad96-4f4b-8910-7233808ce733"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.738245 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c8148fa-ad96-4f4b-8910-7233808ce733-kube-api-access-2gdqg" (OuterVolumeSpecName: "kube-api-access-2gdqg") pod "8c8148fa-ad96-4f4b-8910-7233808ce733" (UID: "8c8148fa-ad96-4f4b-8910-7233808ce733"). InnerVolumeSpecName "kube-api-access-2gdqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.769720 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8148fa-ad96-4f4b-8910-7233808ce733-config-data" (OuterVolumeSpecName: "config-data") pod "8c8148fa-ad96-4f4b-8910-7233808ce733" (UID: "8c8148fa-ad96-4f4b-8910-7233808ce733"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.770509 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8148fa-ad96-4f4b-8910-7233808ce733-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c8148fa-ad96-4f4b-8910-7233808ce733" (UID: "8c8148fa-ad96-4f4b-8910-7233808ce733"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.836767 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c8148fa-ad96-4f4b-8910-7233808ce733-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.836819 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8148fa-ad96-4f4b-8910-7233808ce733-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.836832 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8148fa-ad96-4f4b-8910-7233808ce733-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:40 crc kubenswrapper[4891]: I0929 10:07:40.836841 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gdqg\" (UniqueName: \"kubernetes.io/projected/8c8148fa-ad96-4f4b-8910-7233808ce733-kube-api-access-2gdqg\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:41 crc kubenswrapper[4891]: I0929 10:07:41.189350 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:07:41 crc kubenswrapper[4891]: W0929 10:07:41.204285 4891 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8b5291b_8d02_4e53_acdf_4e42b181ec2a.slice/crio-d86e8feb654b9ef239ae43c6eb7710155de4f3ee92544bff2cb09af64667a3dc WatchSource:0}: Error finding container d86e8feb654b9ef239ae43c6eb7710155de4f3ee92544bff2cb09af64667a3dc: Status 404 returned error can't find the container with id d86e8feb654b9ef239ae43c6eb7710155de4f3ee92544bff2cb09af64667a3dc Sep 29 10:07:41 crc kubenswrapper[4891]: I0929 10:07:41.250208 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8b5291b-8d02-4e53-acdf-4e42b181ec2a","Type":"ContainerStarted","Data":"d86e8feb654b9ef239ae43c6eb7710155de4f3ee92544bff2cb09af64667a3dc"} Sep 29 10:07:41 crc kubenswrapper[4891]: I0929 10:07:41.258913 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kq9cj" event={"ID":"8c8148fa-ad96-4f4b-8910-7233808ce733","Type":"ContainerDied","Data":"395c5c36adf7c502a73d4c55d43ed9396227291a48784b9a0b2ea8c5ad23f822"} Sep 29 10:07:41 crc kubenswrapper[4891]: I0929 10:07:41.258955 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="395c5c36adf7c502a73d4c55d43ed9396227291a48784b9a0b2ea8c5ad23f822" Sep 29 10:07:41 crc kubenswrapper[4891]: I0929 10:07:41.259024 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kq9cj" Sep 29 10:07:41 crc kubenswrapper[4891]: I0929 10:07:41.342357 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 29 10:07:41 crc kubenswrapper[4891]: E0929 10:07:41.342904 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c8148fa-ad96-4f4b-8910-7233808ce733" containerName="nova-cell1-conductor-db-sync" Sep 29 10:07:41 crc kubenswrapper[4891]: I0929 10:07:41.342920 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c8148fa-ad96-4f4b-8910-7233808ce733" containerName="nova-cell1-conductor-db-sync" Sep 29 10:07:41 crc kubenswrapper[4891]: I0929 10:07:41.343156 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c8148fa-ad96-4f4b-8910-7233808ce733" containerName="nova-cell1-conductor-db-sync" Sep 29 10:07:41 crc kubenswrapper[4891]: I0929 10:07:41.344068 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 29 10:07:41 crc kubenswrapper[4891]: I0929 10:07:41.348413 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 29 10:07:41 crc kubenswrapper[4891]: I0929 10:07:41.360292 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 29 10:07:41 crc kubenswrapper[4891]: I0929 10:07:41.451490 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92n8m\" (UniqueName: \"kubernetes.io/projected/854cde66-3c80-472d-b232-45231eef0bbd-kube-api-access-92n8m\") pod \"nova-cell1-conductor-0\" (UID: \"854cde66-3c80-472d-b232-45231eef0bbd\") " pod="openstack/nova-cell1-conductor-0" Sep 29 10:07:41 crc kubenswrapper[4891]: I0929 10:07:41.451733 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/854cde66-3c80-472d-b232-45231eef0bbd-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"854cde66-3c80-472d-b232-45231eef0bbd\") " pod="openstack/nova-cell1-conductor-0" Sep 29 10:07:41 crc kubenswrapper[4891]: I0929 10:07:41.451843 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/854cde66-3c80-472d-b232-45231eef0bbd-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"854cde66-3c80-472d-b232-45231eef0bbd\") " pod="openstack/nova-cell1-conductor-0" Sep 29 10:07:41 crc kubenswrapper[4891]: I0929 10:07:41.553357 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/854cde66-3c80-472d-b232-45231eef0bbd-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"854cde66-3c80-472d-b232-45231eef0bbd\") " pod="openstack/nova-cell1-conductor-0" Sep 29 10:07:41 crc kubenswrapper[4891]: I0929 10:07:41.553948 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92n8m\" (UniqueName: \"kubernetes.io/projected/854cde66-3c80-472d-b232-45231eef0bbd-kube-api-access-92n8m\") pod \"nova-cell1-conductor-0\" (UID: \"854cde66-3c80-472d-b232-45231eef0bbd\") " pod="openstack/nova-cell1-conductor-0" Sep 29 10:07:41 crc kubenswrapper[4891]: I0929 10:07:41.554171 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/854cde66-3c80-472d-b232-45231eef0bbd-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"854cde66-3c80-472d-b232-45231eef0bbd\") " pod="openstack/nova-cell1-conductor-0" Sep 29 10:07:41 crc kubenswrapper[4891]: I0929 10:07:41.558134 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/854cde66-3c80-472d-b232-45231eef0bbd-combined-ca-bundle\") pod 
\"nova-cell1-conductor-0\" (UID: \"854cde66-3c80-472d-b232-45231eef0bbd\") " pod="openstack/nova-cell1-conductor-0" Sep 29 10:07:41 crc kubenswrapper[4891]: I0929 10:07:41.558381 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/854cde66-3c80-472d-b232-45231eef0bbd-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"854cde66-3c80-472d-b232-45231eef0bbd\") " pod="openstack/nova-cell1-conductor-0" Sep 29 10:07:41 crc kubenswrapper[4891]: I0929 10:07:41.572862 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92n8m\" (UniqueName: \"kubernetes.io/projected/854cde66-3c80-472d-b232-45231eef0bbd-kube-api-access-92n8m\") pod \"nova-cell1-conductor-0\" (UID: \"854cde66-3c80-472d-b232-45231eef0bbd\") " pod="openstack/nova-cell1-conductor-0" Sep 29 10:07:41 crc kubenswrapper[4891]: I0929 10:07:41.674873 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 29 10:07:42 crc kubenswrapper[4891]: I0929 10:07:42.258299 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 29 10:07:42 crc kubenswrapper[4891]: W0929 10:07:42.266121 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod854cde66_3c80_472d_b232_45231eef0bbd.slice/crio-dcc2920b03bf147d2e78130a1794065ec9a4e6676890e7701adb0c36f9c0515c WatchSource:0}: Error finding container dcc2920b03bf147d2e78130a1794065ec9a4e6676890e7701adb0c36f9c0515c: Status 404 returned error can't find the container with id dcc2920b03bf147d2e78130a1794065ec9a4e6676890e7701adb0c36f9c0515c Sep 29 10:07:42 crc kubenswrapper[4891]: I0929 10:07:42.295441 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"f8b5291b-8d02-4e53-acdf-4e42b181ec2a","Type":"ContainerStarted","Data":"a062639fe6d8e14f48c12ff4cb49b4911d1f1a2b4d169cd102e15dc3d4314902"} Sep 29 10:07:42 crc kubenswrapper[4891]: I0929 10:07:42.295642 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8b5291b-8d02-4e53-acdf-4e42b181ec2a","Type":"ContainerStarted","Data":"c5a5e5dedd6c6b1cc7f2f2168320ac975aab9bed5e8821e015c719c2ff2f8a67"} Sep 29 10:07:42 crc kubenswrapper[4891]: I0929 10:07:42.323217 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.32318883 podStartE2EDuration="2.32318883s" podCreationTimestamp="2025-09-29 10:07:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:07:42.314804311 +0000 UTC m=+1192.519972652" watchObservedRunningTime="2025-09-29 10:07:42.32318883 +0000 UTC m=+1192.528357171" Sep 29 10:07:42 crc kubenswrapper[4891]: E0929 10:07:42.738689 4891 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0dfcaad1d2aa58b0e3b97cf3fa080dad29331b2753de4550b1329ac394e520d3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 29 10:07:42 crc kubenswrapper[4891]: E0929 10:07:42.740922 4891 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0dfcaad1d2aa58b0e3b97cf3fa080dad29331b2753de4550b1329ac394e520d3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 29 10:07:42 crc kubenswrapper[4891]: E0929 10:07:42.742317 4891 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="0dfcaad1d2aa58b0e3b97cf3fa080dad29331b2753de4550b1329ac394e520d3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 29 10:07:42 crc kubenswrapper[4891]: E0929 10:07:42.742392 4891 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="3cf2b7a9-9892-47f9-b290-775806be28cc" containerName="nova-scheduler-scheduler" Sep 29 10:07:43 crc kubenswrapper[4891]: I0929 10:07:43.310771 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"854cde66-3c80-472d-b232-45231eef0bbd","Type":"ContainerStarted","Data":"603148280c3c34101839763d8ca07ce5baaf3c2a08a53751b7f561bc846f95f2"} Sep 29 10:07:43 crc kubenswrapper[4891]: I0929 10:07:43.310865 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"854cde66-3c80-472d-b232-45231eef0bbd","Type":"ContainerStarted","Data":"dcc2920b03bf147d2e78130a1794065ec9a4e6676890e7701adb0c36f9c0515c"} Sep 29 10:07:43 crc kubenswrapper[4891]: I0929 10:07:43.315452 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6bb4fc677f-jjh8f" podUID="27917a33-23af-42a4-a85c-3bf8f1f9c1d0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.172:5353: i/o timeout" Sep 29 10:07:43 crc kubenswrapper[4891]: I0929 10:07:43.340812 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.340768015 podStartE2EDuration="2.340768015s" podCreationTimestamp="2025-09-29 10:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:07:43.332205311 +0000 UTC m=+1193.537373672" 
watchObservedRunningTime="2025-09-29 10:07:43.340768015 +0000 UTC m=+1193.545936336" Sep 29 10:07:44 crc kubenswrapper[4891]: I0929 10:07:44.324310 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.041146 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.137036 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf2b7a9-9892-47f9-b290-775806be28cc-combined-ca-bundle\") pod \"3cf2b7a9-9892-47f9-b290-775806be28cc\" (UID: \"3cf2b7a9-9892-47f9-b290-775806be28cc\") " Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.137801 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf2b7a9-9892-47f9-b290-775806be28cc-config-data\") pod \"3cf2b7a9-9892-47f9-b290-775806be28cc\" (UID: \"3cf2b7a9-9892-47f9-b290-775806be28cc\") " Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.137872 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxjpg\" (UniqueName: \"kubernetes.io/projected/3cf2b7a9-9892-47f9-b290-775806be28cc-kube-api-access-cxjpg\") pod \"3cf2b7a9-9892-47f9-b290-775806be28cc\" (UID: \"3cf2b7a9-9892-47f9-b290-775806be28cc\") " Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.157478 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cf2b7a9-9892-47f9-b290-775806be28cc-kube-api-access-cxjpg" (OuterVolumeSpecName: "kube-api-access-cxjpg") pod "3cf2b7a9-9892-47f9-b290-775806be28cc" (UID: "3cf2b7a9-9892-47f9-b290-775806be28cc"). InnerVolumeSpecName "kube-api-access-cxjpg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.164073 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf2b7a9-9892-47f9-b290-775806be28cc-config-data" (OuterVolumeSpecName: "config-data") pod "3cf2b7a9-9892-47f9-b290-775806be28cc" (UID: "3cf2b7a9-9892-47f9-b290-775806be28cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.191773 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf2b7a9-9892-47f9-b290-775806be28cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3cf2b7a9-9892-47f9-b290-775806be28cc" (UID: "3cf2b7a9-9892-47f9-b290-775806be28cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.240491 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf2b7a9-9892-47f9-b290-775806be28cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.240531 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf2b7a9-9892-47f9-b290-775806be28cc-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.240541 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxjpg\" (UniqueName: \"kubernetes.io/projected/3cf2b7a9-9892-47f9-b290-775806be28cc-kube-api-access-cxjpg\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.305453 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.335445 4891 generic.go:334] "Generic (PLEG): container finished" podID="3cf2b7a9-9892-47f9-b290-775806be28cc" containerID="0dfcaad1d2aa58b0e3b97cf3fa080dad29331b2753de4550b1329ac394e520d3" exitCode=0 Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.335508 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.335528 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3cf2b7a9-9892-47f9-b290-775806be28cc","Type":"ContainerDied","Data":"0dfcaad1d2aa58b0e3b97cf3fa080dad29331b2753de4550b1329ac394e520d3"} Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.335560 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3cf2b7a9-9892-47f9-b290-775806be28cc","Type":"ContainerDied","Data":"bad59f544f0d7a8cf8c1d42abd134a70827b3835f202344d0cdba539d30fb7d9"} Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.335576 4891 scope.go:117] "RemoveContainer" containerID="0dfcaad1d2aa58b0e3b97cf3fa080dad29331b2753de4550b1329ac394e520d3" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.342005 4891 generic.go:334] "Generic (PLEG): container finished" podID="62b86eba-1d95-4b53-b84c-902ff2665b50" containerID="cc70d7a9d7d7e352a6092b4334e2d05a8d1e5350fa216097c4758243cef8220a" exitCode=0 Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.342051 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.342080 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n742m\" (UniqueName: \"kubernetes.io/projected/62b86eba-1d95-4b53-b84c-902ff2665b50-kube-api-access-n742m\") pod \"62b86eba-1d95-4b53-b84c-902ff2665b50\" (UID: \"62b86eba-1d95-4b53-b84c-902ff2665b50\") " Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.342126 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62b86eba-1d95-4b53-b84c-902ff2665b50-config-data\") pod \"62b86eba-1d95-4b53-b84c-902ff2665b50\" (UID: \"62b86eba-1d95-4b53-b84c-902ff2665b50\") " Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.342078 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"62b86eba-1d95-4b53-b84c-902ff2665b50","Type":"ContainerDied","Data":"cc70d7a9d7d7e352a6092b4334e2d05a8d1e5350fa216097c4758243cef8220a"} Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.342168 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b86eba-1d95-4b53-b84c-902ff2665b50-combined-ca-bundle\") pod \"62b86eba-1d95-4b53-b84c-902ff2665b50\" (UID: \"62b86eba-1d95-4b53-b84c-902ff2665b50\") " Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.342193 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"62b86eba-1d95-4b53-b84c-902ff2665b50","Type":"ContainerDied","Data":"2c03290a437a656cb39b45074eddcc2cadcc552334ac2f289abc81902984accb"} Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.342211 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62b86eba-1d95-4b53-b84c-902ff2665b50-logs\") pod \"62b86eba-1d95-4b53-b84c-902ff2665b50\" 
(UID: \"62b86eba-1d95-4b53-b84c-902ff2665b50\") " Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.343004 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62b86eba-1d95-4b53-b84c-902ff2665b50-logs" (OuterVolumeSpecName: "logs") pod "62b86eba-1d95-4b53-b84c-902ff2665b50" (UID: "62b86eba-1d95-4b53-b84c-902ff2665b50"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.346932 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62b86eba-1d95-4b53-b84c-902ff2665b50-kube-api-access-n742m" (OuterVolumeSpecName: "kube-api-access-n742m") pod "62b86eba-1d95-4b53-b84c-902ff2665b50" (UID: "62b86eba-1d95-4b53-b84c-902ff2665b50"). InnerVolumeSpecName "kube-api-access-n742m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.379458 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.404772 4891 scope.go:117] "RemoveContainer" containerID="0dfcaad1d2aa58b0e3b97cf3fa080dad29331b2753de4550b1329ac394e520d3" Sep 29 10:07:45 crc kubenswrapper[4891]: E0929 10:07:45.405394 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dfcaad1d2aa58b0e3b97cf3fa080dad29331b2753de4550b1329ac394e520d3\": container with ID starting with 0dfcaad1d2aa58b0e3b97cf3fa080dad29331b2753de4550b1329ac394e520d3 not found: ID does not exist" containerID="0dfcaad1d2aa58b0e3b97cf3fa080dad29331b2753de4550b1329ac394e520d3" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.405434 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dfcaad1d2aa58b0e3b97cf3fa080dad29331b2753de4550b1329ac394e520d3"} err="failed to get container status 
\"0dfcaad1d2aa58b0e3b97cf3fa080dad29331b2753de4550b1329ac394e520d3\": rpc error: code = NotFound desc = could not find container \"0dfcaad1d2aa58b0e3b97cf3fa080dad29331b2753de4550b1329ac394e520d3\": container with ID starting with 0dfcaad1d2aa58b0e3b97cf3fa080dad29331b2753de4550b1329ac394e520d3 not found: ID does not exist" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.405457 4891 scope.go:117] "RemoveContainer" containerID="cc70d7a9d7d7e352a6092b4334e2d05a8d1e5350fa216097c4758243cef8220a" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.408508 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62b86eba-1d95-4b53-b84c-902ff2665b50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62b86eba-1d95-4b53-b84c-902ff2665b50" (UID: "62b86eba-1d95-4b53-b84c-902ff2665b50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.408637 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62b86eba-1d95-4b53-b84c-902ff2665b50-config-data" (OuterVolumeSpecName: "config-data") pod "62b86eba-1d95-4b53-b84c-902ff2665b50" (UID: "62b86eba-1d95-4b53-b84c-902ff2665b50"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.418014 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.425822 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:07:45 crc kubenswrapper[4891]: E0929 10:07:45.426273 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62b86eba-1d95-4b53-b84c-902ff2665b50" containerName="nova-api-log" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.426293 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="62b86eba-1d95-4b53-b84c-902ff2665b50" containerName="nova-api-log" Sep 29 10:07:45 crc kubenswrapper[4891]: E0929 10:07:45.426305 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62b86eba-1d95-4b53-b84c-902ff2665b50" containerName="nova-api-api" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.426312 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="62b86eba-1d95-4b53-b84c-902ff2665b50" containerName="nova-api-api" Sep 29 10:07:45 crc kubenswrapper[4891]: E0929 10:07:45.426355 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cf2b7a9-9892-47f9-b290-775806be28cc" containerName="nova-scheduler-scheduler" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.426361 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cf2b7a9-9892-47f9-b290-775806be28cc" containerName="nova-scheduler-scheduler" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.426533 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cf2b7a9-9892-47f9-b290-775806be28cc" containerName="nova-scheduler-scheduler" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.426562 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="62b86eba-1d95-4b53-b84c-902ff2665b50" containerName="nova-api-api" Sep 29 10:07:45 crc 
kubenswrapper[4891]: I0929 10:07:45.426576 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="62b86eba-1d95-4b53-b84c-902ff2665b50" containerName="nova-api-log" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.426862 4891 scope.go:117] "RemoveContainer" containerID="d87afe6c42465f39dc5b47f8faafc1c99b8f5b7c813030210af054dbee3f8399" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.427428 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.429933 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.437304 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.445108 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c501d405-3ec0-4276-bf49-cb633ede21fe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c501d405-3ec0-4276-bf49-cb633ede21fe\") " pod="openstack/nova-scheduler-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.445324 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsnnv\" (UniqueName: \"kubernetes.io/projected/c501d405-3ec0-4276-bf49-cb633ede21fe-kube-api-access-vsnnv\") pod \"nova-scheduler-0\" (UID: \"c501d405-3ec0-4276-bf49-cb633ede21fe\") " pod="openstack/nova-scheduler-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.445469 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c501d405-3ec0-4276-bf49-cb633ede21fe-config-data\") pod \"nova-scheduler-0\" (UID: \"c501d405-3ec0-4276-bf49-cb633ede21fe\") " 
pod="openstack/nova-scheduler-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.445579 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n742m\" (UniqueName: \"kubernetes.io/projected/62b86eba-1d95-4b53-b84c-902ff2665b50-kube-api-access-n742m\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.445601 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62b86eba-1d95-4b53-b84c-902ff2665b50-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.445615 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b86eba-1d95-4b53-b84c-902ff2665b50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.445628 4891 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62b86eba-1d95-4b53-b84c-902ff2665b50-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.448879 4891 scope.go:117] "RemoveContainer" containerID="cc70d7a9d7d7e352a6092b4334e2d05a8d1e5350fa216097c4758243cef8220a" Sep 29 10:07:45 crc kubenswrapper[4891]: E0929 10:07:45.449392 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc70d7a9d7d7e352a6092b4334e2d05a8d1e5350fa216097c4758243cef8220a\": container with ID starting with cc70d7a9d7d7e352a6092b4334e2d05a8d1e5350fa216097c4758243cef8220a not found: ID does not exist" containerID="cc70d7a9d7d7e352a6092b4334e2d05a8d1e5350fa216097c4758243cef8220a" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.449422 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc70d7a9d7d7e352a6092b4334e2d05a8d1e5350fa216097c4758243cef8220a"} err="failed to get container 
status \"cc70d7a9d7d7e352a6092b4334e2d05a8d1e5350fa216097c4758243cef8220a\": rpc error: code = NotFound desc = could not find container \"cc70d7a9d7d7e352a6092b4334e2d05a8d1e5350fa216097c4758243cef8220a\": container with ID starting with cc70d7a9d7d7e352a6092b4334e2d05a8d1e5350fa216097c4758243cef8220a not found: ID does not exist" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.449446 4891 scope.go:117] "RemoveContainer" containerID="d87afe6c42465f39dc5b47f8faafc1c99b8f5b7c813030210af054dbee3f8399" Sep 29 10:07:45 crc kubenswrapper[4891]: E0929 10:07:45.449848 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d87afe6c42465f39dc5b47f8faafc1c99b8f5b7c813030210af054dbee3f8399\": container with ID starting with d87afe6c42465f39dc5b47f8faafc1c99b8f5b7c813030210af054dbee3f8399 not found: ID does not exist" containerID="d87afe6c42465f39dc5b47f8faafc1c99b8f5b7c813030210af054dbee3f8399" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.449870 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d87afe6c42465f39dc5b47f8faafc1c99b8f5b7c813030210af054dbee3f8399"} err="failed to get container status \"d87afe6c42465f39dc5b47f8faafc1c99b8f5b7c813030210af054dbee3f8399\": rpc error: code = NotFound desc = could not find container \"d87afe6c42465f39dc5b47f8faafc1c99b8f5b7c813030210af054dbee3f8399\": container with ID starting with d87afe6c42465f39dc5b47f8faafc1c99b8f5b7c813030210af054dbee3f8399 not found: ID does not exist" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.547881 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsnnv\" (UniqueName: \"kubernetes.io/projected/c501d405-3ec0-4276-bf49-cb633ede21fe-kube-api-access-vsnnv\") pod \"nova-scheduler-0\" (UID: \"c501d405-3ec0-4276-bf49-cb633ede21fe\") " pod="openstack/nova-scheduler-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 
10:07:45.547977 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c501d405-3ec0-4276-bf49-cb633ede21fe-config-data\") pod \"nova-scheduler-0\" (UID: \"c501d405-3ec0-4276-bf49-cb633ede21fe\") " pod="openstack/nova-scheduler-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.548030 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c501d405-3ec0-4276-bf49-cb633ede21fe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c501d405-3ec0-4276-bf49-cb633ede21fe\") " pod="openstack/nova-scheduler-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.551516 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c501d405-3ec0-4276-bf49-cb633ede21fe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c501d405-3ec0-4276-bf49-cb633ede21fe\") " pod="openstack/nova-scheduler-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.552652 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c501d405-3ec0-4276-bf49-cb633ede21fe-config-data\") pod \"nova-scheduler-0\" (UID: \"c501d405-3ec0-4276-bf49-cb633ede21fe\") " pod="openstack/nova-scheduler-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.569473 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsnnv\" (UniqueName: \"kubernetes.io/projected/c501d405-3ec0-4276-bf49-cb633ede21fe-kube-api-access-vsnnv\") pod \"nova-scheduler-0\" (UID: \"c501d405-3ec0-4276-bf49-cb633ede21fe\") " pod="openstack/nova-scheduler-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.662611 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.663019 4891 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.746607 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.746879 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.766099 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.780096 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.782355 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.785471 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.800251 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.856272 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kwsf\" (UniqueName: \"kubernetes.io/projected/4449e962-4991-430a-8104-10c1780fa253-kube-api-access-5kwsf\") pod \"nova-api-0\" (UID: \"4449e962-4991-430a-8104-10c1780fa253\") " pod="openstack/nova-api-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.856390 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4449e962-4991-430a-8104-10c1780fa253-config-data\") pod \"nova-api-0\" (UID: \"4449e962-4991-430a-8104-10c1780fa253\") " pod="openstack/nova-api-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.856568 4891 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4449e962-4991-430a-8104-10c1780fa253-logs\") pod \"nova-api-0\" (UID: \"4449e962-4991-430a-8104-10c1780fa253\") " pod="openstack/nova-api-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.857196 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4449e962-4991-430a-8104-10c1780fa253-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4449e962-4991-430a-8104-10c1780fa253\") " pod="openstack/nova-api-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.959541 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kwsf\" (UniqueName: \"kubernetes.io/projected/4449e962-4991-430a-8104-10c1780fa253-kube-api-access-5kwsf\") pod \"nova-api-0\" (UID: \"4449e962-4991-430a-8104-10c1780fa253\") " pod="openstack/nova-api-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.959592 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4449e962-4991-430a-8104-10c1780fa253-config-data\") pod \"nova-api-0\" (UID: \"4449e962-4991-430a-8104-10c1780fa253\") " pod="openstack/nova-api-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.959626 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4449e962-4991-430a-8104-10c1780fa253-logs\") pod \"nova-api-0\" (UID: \"4449e962-4991-430a-8104-10c1780fa253\") " pod="openstack/nova-api-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.959742 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4449e962-4991-430a-8104-10c1780fa253-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"4449e962-4991-430a-8104-10c1780fa253\") " pod="openstack/nova-api-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.960844 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4449e962-4991-430a-8104-10c1780fa253-logs\") pod \"nova-api-0\" (UID: \"4449e962-4991-430a-8104-10c1780fa253\") " pod="openstack/nova-api-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.965620 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4449e962-4991-430a-8104-10c1780fa253-config-data\") pod \"nova-api-0\" (UID: \"4449e962-4991-430a-8104-10c1780fa253\") " pod="openstack/nova-api-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.976943 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4449e962-4991-430a-8104-10c1780fa253-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4449e962-4991-430a-8104-10c1780fa253\") " pod="openstack/nova-api-0" Sep 29 10:07:45 crc kubenswrapper[4891]: I0929 10:07:45.981844 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kwsf\" (UniqueName: \"kubernetes.io/projected/4449e962-4991-430a-8104-10c1780fa253-kube-api-access-5kwsf\") pod \"nova-api-0\" (UID: \"4449e962-4991-430a-8104-10c1780fa253\") " pod="openstack/nova-api-0" Sep 29 10:07:46 crc kubenswrapper[4891]: I0929 10:07:46.176290 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:07:46 crc kubenswrapper[4891]: I0929 10:07:46.253606 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:07:46 crc kubenswrapper[4891]: W0929 10:07:46.260479 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc501d405_3ec0_4276_bf49_cb633ede21fe.slice/crio-7af06c5f4cd0743117a1f2365b8827bbe3224452a18ad6abe438e85bda17e7df WatchSource:0}: Error finding container 7af06c5f4cd0743117a1f2365b8827bbe3224452a18ad6abe438e85bda17e7df: Status 404 returned error can't find the container with id 7af06c5f4cd0743117a1f2365b8827bbe3224452a18ad6abe438e85bda17e7df Sep 29 10:07:46 crc kubenswrapper[4891]: I0929 10:07:46.367674 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c501d405-3ec0-4276-bf49-cb633ede21fe","Type":"ContainerStarted","Data":"7af06c5f4cd0743117a1f2365b8827bbe3224452a18ad6abe438e85bda17e7df"} Sep 29 10:07:46 crc kubenswrapper[4891]: I0929 10:07:46.410130 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cf2b7a9-9892-47f9-b290-775806be28cc" path="/var/lib/kubelet/pods/3cf2b7a9-9892-47f9-b290-775806be28cc/volumes" Sep 29 10:07:46 crc kubenswrapper[4891]: I0929 10:07:46.411213 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62b86eba-1d95-4b53-b84c-902ff2665b50" path="/var/lib/kubelet/pods/62b86eba-1d95-4b53-b84c-902ff2665b50/volumes" Sep 29 10:07:46 crc kubenswrapper[4891]: W0929 10:07:46.676196 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4449e962_4991_430a_8104_10c1780fa253.slice/crio-7ceb868a21770b87fe8c3829db5e82e538b7d1abd829022c4e245b702cd08b62 WatchSource:0}: Error finding container 7ceb868a21770b87fe8c3829db5e82e538b7d1abd829022c4e245b702cd08b62: Status 404 returned error can't 
find the container with id 7ceb868a21770b87fe8c3829db5e82e538b7d1abd829022c4e245b702cd08b62 Sep 29 10:07:46 crc kubenswrapper[4891]: I0929 10:07:46.677279 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:07:47 crc kubenswrapper[4891]: I0929 10:07:47.308935 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="1c6db8d4-63be-4cd4-9046-3291d40d5bf9" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Sep 29 10:07:47 crc kubenswrapper[4891]: I0929 10:07:47.379035 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c501d405-3ec0-4276-bf49-cb633ede21fe","Type":"ContainerStarted","Data":"264fba4076f4d04ae67af2f111df8c937383ee0f4eb73adf020c1ac859403204"} Sep 29 10:07:47 crc kubenswrapper[4891]: I0929 10:07:47.381415 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4449e962-4991-430a-8104-10c1780fa253","Type":"ContainerStarted","Data":"d5af47b5ab9013118caf7008b608d4d0feb624cdc52f0f539e38c363f66ea27a"} Sep 29 10:07:47 crc kubenswrapper[4891]: I0929 10:07:47.381455 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4449e962-4991-430a-8104-10c1780fa253","Type":"ContainerStarted","Data":"aba5fe528a4185569895dcce97e4f5341f065189b3260c5f01d792d586439aa4"} Sep 29 10:07:47 crc kubenswrapper[4891]: I0929 10:07:47.381466 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4449e962-4991-430a-8104-10c1780fa253","Type":"ContainerStarted","Data":"7ceb868a21770b87fe8c3829db5e82e538b7d1abd829022c4e245b702cd08b62"} Sep 29 10:07:47 crc kubenswrapper[4891]: I0929 10:07:47.408686 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.408667335 podStartE2EDuration="2.408667335s" podCreationTimestamp="2025-09-29 
10:07:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:07:47.401303335 +0000 UTC m=+1197.606471706" watchObservedRunningTime="2025-09-29 10:07:47.408667335 +0000 UTC m=+1197.613835656" Sep 29 10:07:50 crc kubenswrapper[4891]: I0929 10:07:50.663312 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 29 10:07:50 crc kubenswrapper[4891]: I0929 10:07:50.664270 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 29 10:07:50 crc kubenswrapper[4891]: I0929 10:07:50.748182 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 29 10:07:51 crc kubenswrapper[4891]: I0929 10:07:51.674254 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f8b5291b-8d02-4e53-acdf-4e42b181ec2a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 29 10:07:51 crc kubenswrapper[4891]: I0929 10:07:51.674304 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f8b5291b-8d02-4e53-acdf-4e42b181ec2a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 29 10:07:51 crc kubenswrapper[4891]: I0929 10:07:51.724131 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Sep 29 10:07:51 crc kubenswrapper[4891]: I0929 10:07:51.745602 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=6.745575221 podStartE2EDuration="6.745575221s" podCreationTimestamp="2025-09-29 10:07:45 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:07:47.433225185 +0000 UTC m=+1197.638393536" watchObservedRunningTime="2025-09-29 10:07:51.745575221 +0000 UTC m=+1201.950743542" Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.449354 4891 generic.go:334] "Generic (PLEG): container finished" podID="1c6db8d4-63be-4cd4-9046-3291d40d5bf9" containerID="83d1f3977ce6fa274eb7b28120ff27e9f06ce8eefb41e86207e84443ece6f2b3" exitCode=137 Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.449636 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c6db8d4-63be-4cd4-9046-3291d40d5bf9","Type":"ContainerDied","Data":"83d1f3977ce6fa274eb7b28120ff27e9f06ce8eefb41e86207e84443ece6f2b3"} Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.449669 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c6db8d4-63be-4cd4-9046-3291d40d5bf9","Type":"ContainerDied","Data":"d413473fd4b7638978f1cfb440e886aa45667edcec269c5c7f388be23b87d092"} Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.449685 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d413473fd4b7638978f1cfb440e886aa45667edcec269c5c7f388be23b87d092" Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.478751 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.561348 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-config-data\") pod \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.561455 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-run-httpd\") pod \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.561485 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-log-httpd\") pod \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.562244 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1c6db8d4-63be-4cd4-9046-3291d40d5bf9" (UID: "1c6db8d4-63be-4cd4-9046-3291d40d5bf9"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.563337 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-sg-core-conf-yaml\") pod \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.563419 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-scripts\") pod \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.563471 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4c4d\" (UniqueName: \"kubernetes.io/projected/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-kube-api-access-k4c4d\") pod \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.564107 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-combined-ca-bundle\") pod \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\" (UID: \"1c6db8d4-63be-4cd4-9046-3291d40d5bf9\") " Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.564507 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1c6db8d4-63be-4cd4-9046-3291d40d5bf9" (UID: "1c6db8d4-63be-4cd4-9046-3291d40d5bf9"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.565549 4891 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.565572 4891 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.590379 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-scripts" (OuterVolumeSpecName: "scripts") pod "1c6db8d4-63be-4cd4-9046-3291d40d5bf9" (UID: "1c6db8d4-63be-4cd4-9046-3291d40d5bf9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.593458 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-kube-api-access-k4c4d" (OuterVolumeSpecName: "kube-api-access-k4c4d") pod "1c6db8d4-63be-4cd4-9046-3291d40d5bf9" (UID: "1c6db8d4-63be-4cd4-9046-3291d40d5bf9"). InnerVolumeSpecName "kube-api-access-k4c4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.605827 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1c6db8d4-63be-4cd4-9046-3291d40d5bf9" (UID: "1c6db8d4-63be-4cd4-9046-3291d40d5bf9"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.645838 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c6db8d4-63be-4cd4-9046-3291d40d5bf9" (UID: "1c6db8d4-63be-4cd4-9046-3291d40d5bf9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.667906 4891 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.667944 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.667954 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4c4d\" (UniqueName: \"kubernetes.io/projected/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-kube-api-access-k4c4d\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.667966 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.680696 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-config-data" (OuterVolumeSpecName: "config-data") pod "1c6db8d4-63be-4cd4-9046-3291d40d5bf9" (UID: "1c6db8d4-63be-4cd4-9046-3291d40d5bf9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:07:52 crc kubenswrapper[4891]: I0929 10:07:52.769783 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c6db8d4-63be-4cd4-9046-3291d40d5bf9-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.458926 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.501434 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.513807 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.529706 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:07:53 crc kubenswrapper[4891]: E0929 10:07:53.530361 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6db8d4-63be-4cd4-9046-3291d40d5bf9" containerName="ceilometer-central-agent" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.530392 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6db8d4-63be-4cd4-9046-3291d40d5bf9" containerName="ceilometer-central-agent" Sep 29 10:07:53 crc kubenswrapper[4891]: E0929 10:07:53.530440 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6db8d4-63be-4cd4-9046-3291d40d5bf9" containerName="sg-core" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.530452 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6db8d4-63be-4cd4-9046-3291d40d5bf9" containerName="sg-core" Sep 29 10:07:53 crc kubenswrapper[4891]: E0929 10:07:53.530467 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6db8d4-63be-4cd4-9046-3291d40d5bf9" containerName="proxy-httpd" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.530479 4891 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6db8d4-63be-4cd4-9046-3291d40d5bf9" containerName="proxy-httpd" Sep 29 10:07:53 crc kubenswrapper[4891]: E0929 10:07:53.530497 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6db8d4-63be-4cd4-9046-3291d40d5bf9" containerName="ceilometer-notification-agent" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.530508 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6db8d4-63be-4cd4-9046-3291d40d5bf9" containerName="ceilometer-notification-agent" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.530839 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c6db8d4-63be-4cd4-9046-3291d40d5bf9" containerName="ceilometer-notification-agent" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.530868 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c6db8d4-63be-4cd4-9046-3291d40d5bf9" containerName="sg-core" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.530889 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c6db8d4-63be-4cd4-9046-3291d40d5bf9" containerName="proxy-httpd" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.530901 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c6db8d4-63be-4cd4-9046-3291d40d5bf9" containerName="ceilometer-central-agent" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.533602 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.537032 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.537046 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.540000 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.696830 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-log-httpd\") pod \"ceilometer-0\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " pod="openstack/ceilometer-0" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.696953 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snsjc\" (UniqueName: \"kubernetes.io/projected/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-kube-api-access-snsjc\") pod \"ceilometer-0\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " pod="openstack/ceilometer-0" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.697184 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-scripts\") pod \"ceilometer-0\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " pod="openstack/ceilometer-0" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.697324 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " 
pod="openstack/ceilometer-0" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.697393 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " pod="openstack/ceilometer-0" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.697557 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-config-data\") pod \"ceilometer-0\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " pod="openstack/ceilometer-0" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.697675 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-run-httpd\") pod \"ceilometer-0\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " pod="openstack/ceilometer-0" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.799462 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-log-httpd\") pod \"ceilometer-0\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " pod="openstack/ceilometer-0" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.799596 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snsjc\" (UniqueName: \"kubernetes.io/projected/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-kube-api-access-snsjc\") pod \"ceilometer-0\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " pod="openstack/ceilometer-0" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.799645 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-scripts\") pod \"ceilometer-0\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " pod="openstack/ceilometer-0" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.799678 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " pod="openstack/ceilometer-0" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.799706 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " pod="openstack/ceilometer-0" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.799745 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-config-data\") pod \"ceilometer-0\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " pod="openstack/ceilometer-0" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.799778 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-run-httpd\") pod \"ceilometer-0\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " pod="openstack/ceilometer-0" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.800370 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-run-httpd\") pod \"ceilometer-0\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " pod="openstack/ceilometer-0" Sep 29 10:07:53 crc kubenswrapper[4891]: 
I0929 10:07:53.801626 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-log-httpd\") pod \"ceilometer-0\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " pod="openstack/ceilometer-0" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.805220 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-scripts\") pod \"ceilometer-0\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " pod="openstack/ceilometer-0" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.807006 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " pod="openstack/ceilometer-0" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.809140 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " pod="openstack/ceilometer-0" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.809272 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-config-data\") pod \"ceilometer-0\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " pod="openstack/ceilometer-0" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.829995 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snsjc\" (UniqueName: \"kubernetes.io/projected/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-kube-api-access-snsjc\") pod \"ceilometer-0\" (UID: 
\"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " pod="openstack/ceilometer-0" Sep 29 10:07:53 crc kubenswrapper[4891]: I0929 10:07:53.859732 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:07:54 crc kubenswrapper[4891]: I0929 10:07:54.347373 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:07:54 crc kubenswrapper[4891]: I0929 10:07:54.410134 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c6db8d4-63be-4cd4-9046-3291d40d5bf9" path="/var/lib/kubelet/pods/1c6db8d4-63be-4cd4-9046-3291d40d5bf9/volumes" Sep 29 10:07:54 crc kubenswrapper[4891]: I0929 10:07:54.469158 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f","Type":"ContainerStarted","Data":"2c99f4187694313d38e3e4d7b484d9d469c8ad58f6406fcaeacc16859d9a130e"} Sep 29 10:07:55 crc kubenswrapper[4891]: I0929 10:07:55.479596 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f","Type":"ContainerStarted","Data":"01cafd44e71baf108a4696f5cba9cc3ed5529867559c53998282a2539bad8cfd"} Sep 29 10:07:55 crc kubenswrapper[4891]: I0929 10:07:55.748065 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 29 10:07:55 crc kubenswrapper[4891]: I0929 10:07:55.794728 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 29 10:07:56 crc kubenswrapper[4891]: I0929 10:07:56.177354 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 29 10:07:56 crc kubenswrapper[4891]: I0929 10:07:56.177691 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 29 10:07:56 crc kubenswrapper[4891]: I0929 10:07:56.500418 4891 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f","Type":"ContainerStarted","Data":"5969c156d3b6d90ab521ae7d3964e93463ea28d7a4928f1901f2444c8cf925b3"} Sep 29 10:07:56 crc kubenswrapper[4891]: I0929 10:07:56.536345 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 29 10:07:57 crc kubenswrapper[4891]: I0929 10:07:57.260116 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4449e962-4991-430a-8104-10c1780fa253" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 29 10:07:57 crc kubenswrapper[4891]: I0929 10:07:57.260130 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4449e962-4991-430a-8104-10c1780fa253" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 29 10:07:57 crc kubenswrapper[4891]: I0929 10:07:57.521133 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f","Type":"ContainerStarted","Data":"4f12d6d25e89cac9a72761c688c602f1fa3e33d4189c82225dbf4ce111e6dee2"} Sep 29 10:07:58 crc kubenswrapper[4891]: I0929 10:07:58.532752 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f","Type":"ContainerStarted","Data":"1337017575fdcb92c7ee3a5b55cb2291f511910762dce1557c876569d77d4c30"} Sep 29 10:07:58 crc kubenswrapper[4891]: I0929 10:07:58.533934 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 29 10:07:58 crc kubenswrapper[4891]: I0929 10:07:58.578584 4891 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/ceilometer-0" podStartSLOduration=2.486519604 podStartE2EDuration="5.578566739s" podCreationTimestamp="2025-09-29 10:07:53 +0000 UTC" firstStartedPulling="2025-09-29 10:07:54.362677252 +0000 UTC m=+1204.567845573" lastFinishedPulling="2025-09-29 10:07:57.454724387 +0000 UTC m=+1207.659892708" observedRunningTime="2025-09-29 10:07:58.566104334 +0000 UTC m=+1208.771272655" watchObservedRunningTime="2025-09-29 10:07:58.578566739 +0000 UTC m=+1208.783735070" Sep 29 10:08:00 crc kubenswrapper[4891]: I0929 10:08:00.669314 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 29 10:08:00 crc kubenswrapper[4891]: I0929 10:08:00.670140 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 29 10:08:00 crc kubenswrapper[4891]: I0929 10:08:00.676742 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 29 10:08:00 crc kubenswrapper[4891]: I0929 10:08:00.678906 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 29 10:08:03 crc kubenswrapper[4891]: I0929 10:08:03.590125 4891 generic.go:334] "Generic (PLEG): container finished" podID="a94a9f15-db51-47f4-9456-5c64e83e413f" containerID="3ab23e15bdc4bf422e11da792917c1ceac6fdb7285179e467711afac5b806ec0" exitCode=137 Sep 29 10:08:03 crc kubenswrapper[4891]: I0929 10:08:03.590607 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a94a9f15-db51-47f4-9456-5c64e83e413f","Type":"ContainerDied","Data":"3ab23e15bdc4bf422e11da792917c1ceac6fdb7285179e467711afac5b806ec0"} Sep 29 10:08:03 crc kubenswrapper[4891]: I0929 10:08:03.590634 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"a94a9f15-db51-47f4-9456-5c64e83e413f","Type":"ContainerDied","Data":"7999f5c81ee453651f9597cd68fdebadf56281a124b22b57c0e6f0e20199e8e4"} Sep 29 10:08:03 crc kubenswrapper[4891]: I0929 10:08:03.590644 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7999f5c81ee453651f9597cd68fdebadf56281a124b22b57c0e6f0e20199e8e4" Sep 29 10:08:03 crc kubenswrapper[4891]: I0929 10:08:03.640380 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:08:03 crc kubenswrapper[4891]: I0929 10:08:03.737548 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a94a9f15-db51-47f4-9456-5c64e83e413f-combined-ca-bundle\") pod \"a94a9f15-db51-47f4-9456-5c64e83e413f\" (UID: \"a94a9f15-db51-47f4-9456-5c64e83e413f\") " Sep 29 10:08:03 crc kubenswrapper[4891]: I0929 10:08:03.737698 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwq6k\" (UniqueName: \"kubernetes.io/projected/a94a9f15-db51-47f4-9456-5c64e83e413f-kube-api-access-bwq6k\") pod \"a94a9f15-db51-47f4-9456-5c64e83e413f\" (UID: \"a94a9f15-db51-47f4-9456-5c64e83e413f\") " Sep 29 10:08:03 crc kubenswrapper[4891]: I0929 10:08:03.737890 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a94a9f15-db51-47f4-9456-5c64e83e413f-config-data\") pod \"a94a9f15-db51-47f4-9456-5c64e83e413f\" (UID: \"a94a9f15-db51-47f4-9456-5c64e83e413f\") " Sep 29 10:08:03 crc kubenswrapper[4891]: I0929 10:08:03.745663 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a94a9f15-db51-47f4-9456-5c64e83e413f-kube-api-access-bwq6k" (OuterVolumeSpecName: "kube-api-access-bwq6k") pod "a94a9f15-db51-47f4-9456-5c64e83e413f" (UID: "a94a9f15-db51-47f4-9456-5c64e83e413f"). 
InnerVolumeSpecName "kube-api-access-bwq6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:08:03 crc kubenswrapper[4891]: I0929 10:08:03.771542 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a94a9f15-db51-47f4-9456-5c64e83e413f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a94a9f15-db51-47f4-9456-5c64e83e413f" (UID: "a94a9f15-db51-47f4-9456-5c64e83e413f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:08:03 crc kubenswrapper[4891]: I0929 10:08:03.773732 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a94a9f15-db51-47f4-9456-5c64e83e413f-config-data" (OuterVolumeSpecName: "config-data") pod "a94a9f15-db51-47f4-9456-5c64e83e413f" (UID: "a94a9f15-db51-47f4-9456-5c64e83e413f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:08:03 crc kubenswrapper[4891]: I0929 10:08:03.840769 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a94a9f15-db51-47f4-9456-5c64e83e413f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:03 crc kubenswrapper[4891]: I0929 10:08:03.840890 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwq6k\" (UniqueName: \"kubernetes.io/projected/a94a9f15-db51-47f4-9456-5c64e83e413f-kube-api-access-bwq6k\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:03 crc kubenswrapper[4891]: I0929 10:08:03.840906 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a94a9f15-db51-47f4-9456-5c64e83e413f-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.601927 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.625318 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.635290 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.651119 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 10:08:04 crc kubenswrapper[4891]: E0929 10:08:04.651534 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a94a9f15-db51-47f4-9456-5c64e83e413f" containerName="nova-cell1-novncproxy-novncproxy" Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.651558 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="a94a9f15-db51-47f4-9456-5c64e83e413f" containerName="nova-cell1-novncproxy-novncproxy" Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.651968 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="a94a9f15-db51-47f4-9456-5c64e83e413f" containerName="nova-cell1-novncproxy-novncproxy" Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.652609 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.657764 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.657976 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.658194 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.672713 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.760691 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e574b8c-0b11-4d63-a842-239dbbf69258-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e574b8c-0b11-4d63-a842-239dbbf69258\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.760742 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e574b8c-0b11-4d63-a842-239dbbf69258-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e574b8c-0b11-4d63-a842-239dbbf69258\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.760833 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e574b8c-0b11-4d63-a842-239dbbf69258-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e574b8c-0b11-4d63-a842-239dbbf69258\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:08:04 crc kubenswrapper[4891]: 
I0929 10:08:04.760895 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e574b8c-0b11-4d63-a842-239dbbf69258-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e574b8c-0b11-4d63-a842-239dbbf69258\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.760922 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkzzw\" (UniqueName: \"kubernetes.io/projected/2e574b8c-0b11-4d63-a842-239dbbf69258-kube-api-access-dkzzw\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e574b8c-0b11-4d63-a842-239dbbf69258\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.863072 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e574b8c-0b11-4d63-a842-239dbbf69258-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e574b8c-0b11-4d63-a842-239dbbf69258\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.863523 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e574b8c-0b11-4d63-a842-239dbbf69258-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e574b8c-0b11-4d63-a842-239dbbf69258\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.864014 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e574b8c-0b11-4d63-a842-239dbbf69258-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e574b8c-0b11-4d63-a842-239dbbf69258\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.864387 4891 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e574b8c-0b11-4d63-a842-239dbbf69258-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e574b8c-0b11-4d63-a842-239dbbf69258\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.864569 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkzzw\" (UniqueName: \"kubernetes.io/projected/2e574b8c-0b11-4d63-a842-239dbbf69258-kube-api-access-dkzzw\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e574b8c-0b11-4d63-a842-239dbbf69258\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.868762 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e574b8c-0b11-4d63-a842-239dbbf69258-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e574b8c-0b11-4d63-a842-239dbbf69258\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.869139 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e574b8c-0b11-4d63-a842-239dbbf69258-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e574b8c-0b11-4d63-a842-239dbbf69258\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.869188 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e574b8c-0b11-4d63-a842-239dbbf69258-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e574b8c-0b11-4d63-a842-239dbbf69258\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.869276 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e574b8c-0b11-4d63-a842-239dbbf69258-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e574b8c-0b11-4d63-a842-239dbbf69258\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.885292 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkzzw\" (UniqueName: \"kubernetes.io/projected/2e574b8c-0b11-4d63-a842-239dbbf69258-kube-api-access-dkzzw\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e574b8c-0b11-4d63-a842-239dbbf69258\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:08:04 crc kubenswrapper[4891]: I0929 10:08:04.970963 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:08:05 crc kubenswrapper[4891]: W0929 10:08:05.490688 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e574b8c_0b11_4d63_a842_239dbbf69258.slice/crio-d37311c64f8d6033bfabdf4350e61677f37909d8d40d41e208c7874b6683c504 WatchSource:0}: Error finding container d37311c64f8d6033bfabdf4350e61677f37909d8d40d41e208c7874b6683c504: Status 404 returned error can't find the container with id d37311c64f8d6033bfabdf4350e61677f37909d8d40d41e208c7874b6683c504 Sep 29 10:08:05 crc kubenswrapper[4891]: I0929 10:08:05.492995 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 10:08:05 crc kubenswrapper[4891]: I0929 10:08:05.616233 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2e574b8c-0b11-4d63-a842-239dbbf69258","Type":"ContainerStarted","Data":"d37311c64f8d6033bfabdf4350e61677f37909d8d40d41e208c7874b6683c504"} Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.180441 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 29 10:08:06 crc 
kubenswrapper[4891]: I0929 10:08:06.180897 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.181285 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.181340 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.183875 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.185855 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.186163 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.186219 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.186256 4891 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.186853 4891 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"7bbfbfc1e771a86a3e36e83972d117fbe73f0bd186aec5042fa0d90572ce1a07"} pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.186920 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" containerID="cri-o://7bbfbfc1e771a86a3e36e83972d117fbe73f0bd186aec5042fa0d90572ce1a07" gracePeriod=600 Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.436172 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a94a9f15-db51-47f4-9456-5c64e83e413f" path="/var/lib/kubelet/pods/a94a9f15-db51-47f4-9456-5c64e83e413f/volumes" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.436736 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-2l6lc"] Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.438697 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.449781 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-2l6lc"] Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.511059 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-2l6lc\" (UID: \"013367be-7a48-44db-839f-145a78a17cc1\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.511408 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-2l6lc\" (UID: \"013367be-7a48-44db-839f-145a78a17cc1\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.511483 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-config\") pod \"dnsmasq-dns-5c7b6c5df9-2l6lc\" (UID: \"013367be-7a48-44db-839f-145a78a17cc1\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.511522 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-2l6lc\" (UID: \"013367be-7a48-44db-839f-145a78a17cc1\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.511582 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-2l6lc\" (UID: \"013367be-7a48-44db-839f-145a78a17cc1\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.511927 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46fjj\" (UniqueName: \"kubernetes.io/projected/013367be-7a48-44db-839f-145a78a17cc1-kube-api-access-46fjj\") pod \"dnsmasq-dns-5c7b6c5df9-2l6lc\" (UID: \"013367be-7a48-44db-839f-145a78a17cc1\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.614238 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46fjj\" (UniqueName: \"kubernetes.io/projected/013367be-7a48-44db-839f-145a78a17cc1-kube-api-access-46fjj\") pod \"dnsmasq-dns-5c7b6c5df9-2l6lc\" (UID: \"013367be-7a48-44db-839f-145a78a17cc1\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.614321 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-2l6lc\" (UID: \"013367be-7a48-44db-839f-145a78a17cc1\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.614481 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-2l6lc\" (UID: \"013367be-7a48-44db-839f-145a78a17cc1\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.614548 4891 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-config\") pod \"dnsmasq-dns-5c7b6c5df9-2l6lc\" (UID: \"013367be-7a48-44db-839f-145a78a17cc1\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.614586 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-2l6lc\" (UID: \"013367be-7a48-44db-839f-145a78a17cc1\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.614622 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-2l6lc\" (UID: \"013367be-7a48-44db-839f-145a78a17cc1\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.615712 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-2l6lc\" (UID: \"013367be-7a48-44db-839f-145a78a17cc1\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.616846 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-2l6lc\" (UID: \"013367be-7a48-44db-839f-145a78a17cc1\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.617742 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-2l6lc\" (UID: \"013367be-7a48-44db-839f-145a78a17cc1\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.617975 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-2l6lc\" (UID: \"013367be-7a48-44db-839f-145a78a17cc1\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.618307 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-config\") pod \"dnsmasq-dns-5c7b6c5df9-2l6lc\" (UID: \"013367be-7a48-44db-839f-145a78a17cc1\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.635337 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2e574b8c-0b11-4d63-a842-239dbbf69258","Type":"ContainerStarted","Data":"a38ed75edb6e7dff5a68d001367a65028f911d40c0f3b043c02c6313c1bfdd6c"} Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.644905 4891 generic.go:334] "Generic (PLEG): container finished" podID="582de198-5a15-4c4c-aaea-881c638a42ac" containerID="7bbfbfc1e771a86a3e36e83972d117fbe73f0bd186aec5042fa0d90572ce1a07" exitCode=0 Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.645166 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerDied","Data":"7bbfbfc1e771a86a3e36e83972d117fbe73f0bd186aec5042fa0d90572ce1a07"} Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.645225 4891 scope.go:117] "RemoveContainer" 
containerID="65905951ff597e2e9f5a100530eb12ae214ca6c149882ca87ecb04c813df66e6" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.676215 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.676194793 podStartE2EDuration="2.676194793s" podCreationTimestamp="2025-09-29 10:08:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:08:06.667267628 +0000 UTC m=+1216.872435949" watchObservedRunningTime="2025-09-29 10:08:06.676194793 +0000 UTC m=+1216.881363114" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.682382 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46fjj\" (UniqueName: \"kubernetes.io/projected/013367be-7a48-44db-839f-145a78a17cc1-kube-api-access-46fjj\") pod \"dnsmasq-dns-5c7b6c5df9-2l6lc\" (UID: \"013367be-7a48-44db-839f-145a78a17cc1\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" Sep 29 10:08:06 crc kubenswrapper[4891]: I0929 10:08:06.810521 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" Sep 29 10:08:07 crc kubenswrapper[4891]: I0929 10:08:07.516710 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-2l6lc"] Sep 29 10:08:07 crc kubenswrapper[4891]: I0929 10:08:07.657872 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerStarted","Data":"d8a4450f0cf100e3bb069af7451326e6487159492cf29cc8115c4ea917d31960"} Sep 29 10:08:07 crc kubenswrapper[4891]: I0929 10:08:07.660622 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" event={"ID":"013367be-7a48-44db-839f-145a78a17cc1","Type":"ContainerStarted","Data":"884cf13e5a754618beead72f8d3481e8c981e68f6e9ffc4c533ff5031915af2b"} Sep 29 10:08:08 crc kubenswrapper[4891]: I0929 10:08:08.673345 4891 generic.go:334] "Generic (PLEG): container finished" podID="013367be-7a48-44db-839f-145a78a17cc1" containerID="bcbe6effabfe684fc3e2cd64c877af4b60f06ea8d75c6e31f19c8a393874e149" exitCode=0 Sep 29 10:08:08 crc kubenswrapper[4891]: I0929 10:08:08.674643 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" event={"ID":"013367be-7a48-44db-839f-145a78a17cc1","Type":"ContainerDied","Data":"bcbe6effabfe684fc3e2cd64c877af4b60f06ea8d75c6e31f19c8a393874e149"} Sep 29 10:08:09 crc kubenswrapper[4891]: I0929 10:08:09.007129 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:08:09 crc kubenswrapper[4891]: I0929 10:08:09.007680 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4449e962-4991-430a-8104-10c1780fa253" containerName="nova-api-log" containerID="cri-o://aba5fe528a4185569895dcce97e4f5341f065189b3260c5f01d792d586439aa4" gracePeriod=30 Sep 29 10:08:09 crc kubenswrapper[4891]: I0929 
10:08:09.007818 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4449e962-4991-430a-8104-10c1780fa253" containerName="nova-api-api" containerID="cri-o://d5af47b5ab9013118caf7008b608d4d0feb624cdc52f0f539e38c363f66ea27a" gracePeriod=30 Sep 29 10:08:09 crc kubenswrapper[4891]: I0929 10:08:09.042619 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:08:09 crc kubenswrapper[4891]: I0929 10:08:09.042975 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" containerName="ceilometer-central-agent" containerID="cri-o://01cafd44e71baf108a4696f5cba9cc3ed5529867559c53998282a2539bad8cfd" gracePeriod=30 Sep 29 10:08:09 crc kubenswrapper[4891]: I0929 10:08:09.043037 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" containerName="proxy-httpd" containerID="cri-o://1337017575fdcb92c7ee3a5b55cb2291f511910762dce1557c876569d77d4c30" gracePeriod=30 Sep 29 10:08:09 crc kubenswrapper[4891]: I0929 10:08:09.043087 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" containerName="ceilometer-notification-agent" containerID="cri-o://5969c156d3b6d90ab521ae7d3964e93463ea28d7a4928f1901f2444c8cf925b3" gracePeriod=30 Sep 29 10:08:09 crc kubenswrapper[4891]: I0929 10:08:09.043475 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" containerName="sg-core" containerID="cri-o://4f12d6d25e89cac9a72761c688c602f1fa3e33d4189c82225dbf4ce111e6dee2" gracePeriod=30 Sep 29 10:08:09 crc kubenswrapper[4891]: I0929 10:08:09.056413 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" 
podUID="0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.198:3000/\": EOF" Sep 29 10:08:09 crc kubenswrapper[4891]: I0929 10:08:09.725469 4891 generic.go:334] "Generic (PLEG): container finished" podID="0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" containerID="1337017575fdcb92c7ee3a5b55cb2291f511910762dce1557c876569d77d4c30" exitCode=0 Sep 29 10:08:09 crc kubenswrapper[4891]: I0929 10:08:09.725802 4891 generic.go:334] "Generic (PLEG): container finished" podID="0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" containerID="4f12d6d25e89cac9a72761c688c602f1fa3e33d4189c82225dbf4ce111e6dee2" exitCode=2 Sep 29 10:08:09 crc kubenswrapper[4891]: I0929 10:08:09.725814 4891 generic.go:334] "Generic (PLEG): container finished" podID="0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" containerID="01cafd44e71baf108a4696f5cba9cc3ed5529867559c53998282a2539bad8cfd" exitCode=0 Sep 29 10:08:09 crc kubenswrapper[4891]: I0929 10:08:09.725545 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f","Type":"ContainerDied","Data":"1337017575fdcb92c7ee3a5b55cb2291f511910762dce1557c876569d77d4c30"} Sep 29 10:08:09 crc kubenswrapper[4891]: I0929 10:08:09.725883 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f","Type":"ContainerDied","Data":"4f12d6d25e89cac9a72761c688c602f1fa3e33d4189c82225dbf4ce111e6dee2"} Sep 29 10:08:09 crc kubenswrapper[4891]: I0929 10:08:09.725902 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f","Type":"ContainerDied","Data":"01cafd44e71baf108a4696f5cba9cc3ed5529867559c53998282a2539bad8cfd"} Sep 29 10:08:09 crc kubenswrapper[4891]: I0929 10:08:09.727712 4891 generic.go:334] "Generic (PLEG): container finished" podID="4449e962-4991-430a-8104-10c1780fa253" 
containerID="aba5fe528a4185569895dcce97e4f5341f065189b3260c5f01d792d586439aa4" exitCode=143 Sep 29 10:08:09 crc kubenswrapper[4891]: I0929 10:08:09.727805 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4449e962-4991-430a-8104-10c1780fa253","Type":"ContainerDied","Data":"aba5fe528a4185569895dcce97e4f5341f065189b3260c5f01d792d586439aa4"} Sep 29 10:08:09 crc kubenswrapper[4891]: I0929 10:08:09.729420 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" event={"ID":"013367be-7a48-44db-839f-145a78a17cc1","Type":"ContainerStarted","Data":"8c176e2a9a45eadf839b665216c6b347cfab4bba431f2a16029c3d64ad1d0ece"} Sep 29 10:08:09 crc kubenswrapper[4891]: I0929 10:08:09.730368 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" Sep 29 10:08:09 crc kubenswrapper[4891]: I0929 10:08:09.971776 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.630345 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.661360 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kwsf\" (UniqueName: \"kubernetes.io/projected/4449e962-4991-430a-8104-10c1780fa253-kube-api-access-5kwsf\") pod \"4449e962-4991-430a-8104-10c1780fa253\" (UID: \"4449e962-4991-430a-8104-10c1780fa253\") " Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.661479 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4449e962-4991-430a-8104-10c1780fa253-combined-ca-bundle\") pod \"4449e962-4991-430a-8104-10c1780fa253\" (UID: \"4449e962-4991-430a-8104-10c1780fa253\") " Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.661508 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4449e962-4991-430a-8104-10c1780fa253-config-data\") pod \"4449e962-4991-430a-8104-10c1780fa253\" (UID: \"4449e962-4991-430a-8104-10c1780fa253\") " Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.661578 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4449e962-4991-430a-8104-10c1780fa253-logs\") pod \"4449e962-4991-430a-8104-10c1780fa253\" (UID: \"4449e962-4991-430a-8104-10c1780fa253\") " Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.663612 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4449e962-4991-430a-8104-10c1780fa253-logs" (OuterVolumeSpecName: "logs") pod "4449e962-4991-430a-8104-10c1780fa253" (UID: "4449e962-4991-430a-8104-10c1780fa253"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.676038 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" podStartSLOduration=6.6760109100000005 podStartE2EDuration="6.67601091s" podCreationTimestamp="2025-09-29 10:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:08:09.802937046 +0000 UTC m=+1220.008105367" watchObservedRunningTime="2025-09-29 10:08:12.67601091 +0000 UTC m=+1222.881179231" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.688772 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4449e962-4991-430a-8104-10c1780fa253-kube-api-access-5kwsf" (OuterVolumeSpecName: "kube-api-access-5kwsf") pod "4449e962-4991-430a-8104-10c1780fa253" (UID: "4449e962-4991-430a-8104-10c1780fa253"). InnerVolumeSpecName "kube-api-access-5kwsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.696921 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4449e962-4991-430a-8104-10c1780fa253-config-data" (OuterVolumeSpecName: "config-data") pod "4449e962-4991-430a-8104-10c1780fa253" (UID: "4449e962-4991-430a-8104-10c1780fa253"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.752042 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4449e962-4991-430a-8104-10c1780fa253-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4449e962-4991-430a-8104-10c1780fa253" (UID: "4449e962-4991-430a-8104-10c1780fa253"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.764954 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kwsf\" (UniqueName: \"kubernetes.io/projected/4449e962-4991-430a-8104-10c1780fa253-kube-api-access-5kwsf\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.764994 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4449e962-4991-430a-8104-10c1780fa253-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.765008 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4449e962-4991-430a-8104-10c1780fa253-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.765020 4891 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4449e962-4991-430a-8104-10c1780fa253-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.769434 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.775944 4891 generic.go:334] "Generic (PLEG): container finished" podID="4449e962-4991-430a-8104-10c1780fa253" containerID="d5af47b5ab9013118caf7008b608d4d0feb624cdc52f0f539e38c363f66ea27a" exitCode=0 Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.776026 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4449e962-4991-430a-8104-10c1780fa253","Type":"ContainerDied","Data":"d5af47b5ab9013118caf7008b608d4d0feb624cdc52f0f539e38c363f66ea27a"} Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.776057 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4449e962-4991-430a-8104-10c1780fa253","Type":"ContainerDied","Data":"7ceb868a21770b87fe8c3829db5e82e538b7d1abd829022c4e245b702cd08b62"} Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.776078 4891 scope.go:117] "RemoveContainer" containerID="d5af47b5ab9013118caf7008b608d4d0feb624cdc52f0f539e38c363f66ea27a" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.776072 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.781685 4891 generic.go:334] "Generic (PLEG): container finished" podID="0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" containerID="5969c156d3b6d90ab521ae7d3964e93463ea28d7a4928f1901f2444c8cf925b3" exitCode=0 Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.781735 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f","Type":"ContainerDied","Data":"5969c156d3b6d90ab521ae7d3964e93463ea28d7a4928f1901f2444c8cf925b3"} Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.781766 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f","Type":"ContainerDied","Data":"2c99f4187694313d38e3e4d7b484d9d469c8ad58f6406fcaeacc16859d9a130e"} Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.781866 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.826802 4891 scope.go:117] "RemoveContainer" containerID="aba5fe528a4185569895dcce97e4f5341f065189b3260c5f01d792d586439aa4" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.830326 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.849083 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.866300 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-log-httpd\") pod \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.866416 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snsjc\" (UniqueName: \"kubernetes.io/projected/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-kube-api-access-snsjc\") pod \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.866489 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-run-httpd\") pod \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.866528 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-config-data\") pod \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.866592 4891 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-sg-core-conf-yaml\") pod \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.866630 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-combined-ca-bundle\") pod \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.866717 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-scripts\") pod \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\" (UID: \"0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f\") " Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.869982 4891 scope.go:117] "RemoveContainer" containerID="d5af47b5ab9013118caf7008b608d4d0feb624cdc52f0f539e38c363f66ea27a" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.870186 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" (UID: "0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.870410 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" (UID: "0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:08:12 crc kubenswrapper[4891]: E0929 10:08:12.872936 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5af47b5ab9013118caf7008b608d4d0feb624cdc52f0f539e38c363f66ea27a\": container with ID starting with d5af47b5ab9013118caf7008b608d4d0feb624cdc52f0f539e38c363f66ea27a not found: ID does not exist" containerID="d5af47b5ab9013118caf7008b608d4d0feb624cdc52f0f539e38c363f66ea27a" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.873015 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5af47b5ab9013118caf7008b608d4d0feb624cdc52f0f539e38c363f66ea27a"} err="failed to get container status \"d5af47b5ab9013118caf7008b608d4d0feb624cdc52f0f539e38c363f66ea27a\": rpc error: code = NotFound desc = could not find container \"d5af47b5ab9013118caf7008b608d4d0feb624cdc52f0f539e38c363f66ea27a\": container with ID starting with d5af47b5ab9013118caf7008b608d4d0feb624cdc52f0f539e38c363f66ea27a not found: ID does not exist" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.873048 4891 scope.go:117] "RemoveContainer" containerID="aba5fe528a4185569895dcce97e4f5341f065189b3260c5f01d792d586439aa4" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.873928 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-scripts" (OuterVolumeSpecName: "scripts") pod "0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" (UID: "0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:08:12 crc kubenswrapper[4891]: E0929 10:08:12.874090 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aba5fe528a4185569895dcce97e4f5341f065189b3260c5f01d792d586439aa4\": container with ID starting with aba5fe528a4185569895dcce97e4f5341f065189b3260c5f01d792d586439aa4 not found: ID does not exist" containerID="aba5fe528a4185569895dcce97e4f5341f065189b3260c5f01d792d586439aa4" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.874141 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aba5fe528a4185569895dcce97e4f5341f065189b3260c5f01d792d586439aa4"} err="failed to get container status \"aba5fe528a4185569895dcce97e4f5341f065189b3260c5f01d792d586439aa4\": rpc error: code = NotFound desc = could not find container \"aba5fe528a4185569895dcce97e4f5341f065189b3260c5f01d792d586439aa4\": container with ID starting with aba5fe528a4185569895dcce97e4f5341f065189b3260c5f01d792d586439aa4 not found: ID does not exist" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.874173 4891 scope.go:117] "RemoveContainer" containerID="1337017575fdcb92c7ee3a5b55cb2291f511910762dce1557c876569d77d4c30" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.876191 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-kube-api-access-snsjc" (OuterVolumeSpecName: "kube-api-access-snsjc") pod "0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" (UID: "0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f"). InnerVolumeSpecName "kube-api-access-snsjc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.888225 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 29 10:08:12 crc kubenswrapper[4891]: E0929 10:08:12.888773 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" containerName="sg-core" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.888805 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" containerName="sg-core" Sep 29 10:08:12 crc kubenswrapper[4891]: E0929 10:08:12.888827 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4449e962-4991-430a-8104-10c1780fa253" containerName="nova-api-log" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.888836 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="4449e962-4991-430a-8104-10c1780fa253" containerName="nova-api-log" Sep 29 10:08:12 crc kubenswrapper[4891]: E0929 10:08:12.888855 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" containerName="proxy-httpd" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.888864 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" containerName="proxy-httpd" Sep 29 10:08:12 crc kubenswrapper[4891]: E0929 10:08:12.888877 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" containerName="ceilometer-notification-agent" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.888884 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" containerName="ceilometer-notification-agent" Sep 29 10:08:12 crc kubenswrapper[4891]: E0929 10:08:12.888900 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4449e962-4991-430a-8104-10c1780fa253" containerName="nova-api-api" Sep 29 
10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.888907 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="4449e962-4991-430a-8104-10c1780fa253" containerName="nova-api-api" Sep 29 10:08:12 crc kubenswrapper[4891]: E0929 10:08:12.888936 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" containerName="ceilometer-central-agent" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.888945 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" containerName="ceilometer-central-agent" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.889188 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" containerName="sg-core" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.889203 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="4449e962-4991-430a-8104-10c1780fa253" containerName="nova-api-api" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.889217 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" containerName="proxy-httpd" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.889231 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" containerName="ceilometer-notification-agent" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.889252 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="4449e962-4991-430a-8104-10c1780fa253" containerName="nova-api-log" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.889263 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" containerName="ceilometer-central-agent" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.890590 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.893386 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.893674 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.893743 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.913098 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.941144 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" (UID: "0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.969696 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-logs\") pod \"nova-api-0\" (UID: \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\") " pod="openstack/nova-api-0" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.969774 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-config-data\") pod \"nova-api-0\" (UID: \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\") " pod="openstack/nova-api-0" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.970059 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\") " pod="openstack/nova-api-0" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.970217 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-public-tls-certs\") pod \"nova-api-0\" (UID: \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\") " pod="openstack/nova-api-0" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.970476 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\") " pod="openstack/nova-api-0" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.970599 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw9bc\" (UniqueName: \"kubernetes.io/projected/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-kube-api-access-zw9bc\") pod \"nova-api-0\" (UID: \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\") " pod="openstack/nova-api-0" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.970680 4891 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.970707 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snsjc\" (UniqueName: \"kubernetes.io/projected/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-kube-api-access-snsjc\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.970722 4891 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.970734 4891 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.970745 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:12 crc kubenswrapper[4891]: I0929 10:08:12.972851 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" (UID: "0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.040556 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-config-data" (OuterVolumeSpecName: "config-data") pod "0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" (UID: "0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.073744 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-logs\") pod \"nova-api-0\" (UID: \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\") " pod="openstack/nova-api-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.072853 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-logs\") pod \"nova-api-0\" (UID: \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\") " pod="openstack/nova-api-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.074131 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-config-data\") pod \"nova-api-0\" (UID: \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\") " pod="openstack/nova-api-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.074327 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\") " pod="openstack/nova-api-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.074445 4891 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-public-tls-certs\") pod \"nova-api-0\" (UID: \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\") " pod="openstack/nova-api-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.074783 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\") " pod="openstack/nova-api-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.074928 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw9bc\" (UniqueName: \"kubernetes.io/projected/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-kube-api-access-zw9bc\") pod \"nova-api-0\" (UID: \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\") " pod="openstack/nova-api-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.075153 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.075203 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.078365 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-public-tls-certs\") pod \"nova-api-0\" (UID: \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\") " pod="openstack/nova-api-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.078377 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\") " pod="openstack/nova-api-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.080289 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-config-data\") pod \"nova-api-0\" (UID: \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\") " pod="openstack/nova-api-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.083521 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\") " pod="openstack/nova-api-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.094653 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw9bc\" (UniqueName: \"kubernetes.io/projected/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-kube-api-access-zw9bc\") pod \"nova-api-0\" (UID: \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\") " pod="openstack/nova-api-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.185587 4891 scope.go:117] "RemoveContainer" containerID="4f12d6d25e89cac9a72761c688c602f1fa3e33d4189c82225dbf4ce111e6dee2" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.208286 4891 scope.go:117] "RemoveContainer" containerID="5969c156d3b6d90ab521ae7d3964e93463ea28d7a4928f1901f2444c8cf925b3" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.215337 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.250041 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.261359 4891 scope.go:117] "RemoveContainer" containerID="01cafd44e71baf108a4696f5cba9cc3ed5529867559c53998282a2539bad8cfd" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.262632 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.275147 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.280905 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.283537 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.283880 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.284946 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.296779 4891 scope.go:117] "RemoveContainer" containerID="1337017575fdcb92c7ee3a5b55cb2291f511910762dce1557c876569d77d4c30" Sep 29 10:08:13 crc kubenswrapper[4891]: E0929 10:08:13.300023 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1337017575fdcb92c7ee3a5b55cb2291f511910762dce1557c876569d77d4c30\": container with ID starting with 1337017575fdcb92c7ee3a5b55cb2291f511910762dce1557c876569d77d4c30 not found: ID does not exist" containerID="1337017575fdcb92c7ee3a5b55cb2291f511910762dce1557c876569d77d4c30" Sep 29 10:08:13 crc 
kubenswrapper[4891]: I0929 10:08:13.300086 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1337017575fdcb92c7ee3a5b55cb2291f511910762dce1557c876569d77d4c30"} err="failed to get container status \"1337017575fdcb92c7ee3a5b55cb2291f511910762dce1557c876569d77d4c30\": rpc error: code = NotFound desc = could not find container \"1337017575fdcb92c7ee3a5b55cb2291f511910762dce1557c876569d77d4c30\": container with ID starting with 1337017575fdcb92c7ee3a5b55cb2291f511910762dce1557c876569d77d4c30 not found: ID does not exist" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.300120 4891 scope.go:117] "RemoveContainer" containerID="4f12d6d25e89cac9a72761c688c602f1fa3e33d4189c82225dbf4ce111e6dee2" Sep 29 10:08:13 crc kubenswrapper[4891]: E0929 10:08:13.300613 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f12d6d25e89cac9a72761c688c602f1fa3e33d4189c82225dbf4ce111e6dee2\": container with ID starting with 4f12d6d25e89cac9a72761c688c602f1fa3e33d4189c82225dbf4ce111e6dee2 not found: ID does not exist" containerID="4f12d6d25e89cac9a72761c688c602f1fa3e33d4189c82225dbf4ce111e6dee2" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.300695 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f12d6d25e89cac9a72761c688c602f1fa3e33d4189c82225dbf4ce111e6dee2"} err="failed to get container status \"4f12d6d25e89cac9a72761c688c602f1fa3e33d4189c82225dbf4ce111e6dee2\": rpc error: code = NotFound desc = could not find container \"4f12d6d25e89cac9a72761c688c602f1fa3e33d4189c82225dbf4ce111e6dee2\": container with ID starting with 4f12d6d25e89cac9a72761c688c602f1fa3e33d4189c82225dbf4ce111e6dee2 not found: ID does not exist" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.300729 4891 scope.go:117] "RemoveContainer" containerID="5969c156d3b6d90ab521ae7d3964e93463ea28d7a4928f1901f2444c8cf925b3" Sep 29 
10:08:13 crc kubenswrapper[4891]: E0929 10:08:13.301390 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5969c156d3b6d90ab521ae7d3964e93463ea28d7a4928f1901f2444c8cf925b3\": container with ID starting with 5969c156d3b6d90ab521ae7d3964e93463ea28d7a4928f1901f2444c8cf925b3 not found: ID does not exist" containerID="5969c156d3b6d90ab521ae7d3964e93463ea28d7a4928f1901f2444c8cf925b3" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.301424 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5969c156d3b6d90ab521ae7d3964e93463ea28d7a4928f1901f2444c8cf925b3"} err="failed to get container status \"5969c156d3b6d90ab521ae7d3964e93463ea28d7a4928f1901f2444c8cf925b3\": rpc error: code = NotFound desc = could not find container \"5969c156d3b6d90ab521ae7d3964e93463ea28d7a4928f1901f2444c8cf925b3\": container with ID starting with 5969c156d3b6d90ab521ae7d3964e93463ea28d7a4928f1901f2444c8cf925b3 not found: ID does not exist" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.301466 4891 scope.go:117] "RemoveContainer" containerID="01cafd44e71baf108a4696f5cba9cc3ed5529867559c53998282a2539bad8cfd" Sep 29 10:08:13 crc kubenswrapper[4891]: E0929 10:08:13.301833 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01cafd44e71baf108a4696f5cba9cc3ed5529867559c53998282a2539bad8cfd\": container with ID starting with 01cafd44e71baf108a4696f5cba9cc3ed5529867559c53998282a2539bad8cfd not found: ID does not exist" containerID="01cafd44e71baf108a4696f5cba9cc3ed5529867559c53998282a2539bad8cfd" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.301887 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01cafd44e71baf108a4696f5cba9cc3ed5529867559c53998282a2539bad8cfd"} err="failed to get container status 
\"01cafd44e71baf108a4696f5cba9cc3ed5529867559c53998282a2539bad8cfd\": rpc error: code = NotFound desc = could not find container \"01cafd44e71baf108a4696f5cba9cc3ed5529867559c53998282a2539bad8cfd\": container with ID starting with 01cafd44e71baf108a4696f5cba9cc3ed5529867559c53998282a2539bad8cfd not found: ID does not exist" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.386935 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c908bd0e-e209-465b-b98e-f892a4a270f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " pod="openstack/ceilometer-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.387179 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c908bd0e-e209-465b-b98e-f892a4a270f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " pod="openstack/ceilometer-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.387453 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4jsj\" (UniqueName: \"kubernetes.io/projected/c908bd0e-e209-465b-b98e-f892a4a270f8-kube-api-access-v4jsj\") pod \"ceilometer-0\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " pod="openstack/ceilometer-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.387644 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c908bd0e-e209-465b-b98e-f892a4a270f8-scripts\") pod \"ceilometer-0\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " pod="openstack/ceilometer-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.387739 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c908bd0e-e209-465b-b98e-f892a4a270f8-run-httpd\") pod \"ceilometer-0\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " pod="openstack/ceilometer-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.387860 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c908bd0e-e209-465b-b98e-f892a4a270f8-config-data\") pod \"ceilometer-0\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " pod="openstack/ceilometer-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.388058 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c908bd0e-e209-465b-b98e-f892a4a270f8-log-httpd\") pod \"ceilometer-0\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " pod="openstack/ceilometer-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.490336 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c908bd0e-e209-465b-b98e-f892a4a270f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " pod="openstack/ceilometer-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.490440 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4jsj\" (UniqueName: \"kubernetes.io/projected/c908bd0e-e209-465b-b98e-f892a4a270f8-kube-api-access-v4jsj\") pod \"ceilometer-0\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " pod="openstack/ceilometer-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.490490 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c908bd0e-e209-465b-b98e-f892a4a270f8-scripts\") pod \"ceilometer-0\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " 
pod="openstack/ceilometer-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.490538 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c908bd0e-e209-465b-b98e-f892a4a270f8-run-httpd\") pod \"ceilometer-0\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " pod="openstack/ceilometer-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.490565 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c908bd0e-e209-465b-b98e-f892a4a270f8-config-data\") pod \"ceilometer-0\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " pod="openstack/ceilometer-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.490643 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c908bd0e-e209-465b-b98e-f892a4a270f8-log-httpd\") pod \"ceilometer-0\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " pod="openstack/ceilometer-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.490724 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c908bd0e-e209-465b-b98e-f892a4a270f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " pod="openstack/ceilometer-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.491574 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c908bd0e-e209-465b-b98e-f892a4a270f8-log-httpd\") pod \"ceilometer-0\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " pod="openstack/ceilometer-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.492088 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/c908bd0e-e209-465b-b98e-f892a4a270f8-run-httpd\") pod \"ceilometer-0\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " pod="openstack/ceilometer-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.495764 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c908bd0e-e209-465b-b98e-f892a4a270f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " pod="openstack/ceilometer-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.496101 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c908bd0e-e209-465b-b98e-f892a4a270f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " pod="openstack/ceilometer-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.496482 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c908bd0e-e209-465b-b98e-f892a4a270f8-scripts\") pod \"ceilometer-0\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " pod="openstack/ceilometer-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.497620 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c908bd0e-e209-465b-b98e-f892a4a270f8-config-data\") pod \"ceilometer-0\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " pod="openstack/ceilometer-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.512483 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4jsj\" (UniqueName: \"kubernetes.io/projected/c908bd0e-e209-465b-b98e-f892a4a270f8-kube-api-access-v4jsj\") pod \"ceilometer-0\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " pod="openstack/ceilometer-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.613538 4891 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.766778 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:08:13 crc kubenswrapper[4891]: W0929 10:08:13.770107 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3acc3f25_d4d9_44d9_87c8_69d5f5b51e61.slice/crio-c74b9a2d1e212af50dcdc7c8c9a3924d9fd63d794653b06bdd7c10e7948ff41c WatchSource:0}: Error finding container c74b9a2d1e212af50dcdc7c8c9a3924d9fd63d794653b06bdd7c10e7948ff41c: Status 404 returned error can't find the container with id c74b9a2d1e212af50dcdc7c8c9a3924d9fd63d794653b06bdd7c10e7948ff41c Sep 29 10:08:13 crc kubenswrapper[4891]: I0929 10:08:13.793956 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61","Type":"ContainerStarted","Data":"c74b9a2d1e212af50dcdc7c8c9a3924d9fd63d794653b06bdd7c10e7948ff41c"} Sep 29 10:08:14 crc kubenswrapper[4891]: W0929 10:08:14.077524 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc908bd0e_e209_465b_b98e_f892a4a270f8.slice/crio-2e378fd9fce92a03c84200e0004270fa67af34706e747cf40cd8933f5d8de934 WatchSource:0}: Error finding container 2e378fd9fce92a03c84200e0004270fa67af34706e747cf40cd8933f5d8de934: Status 404 returned error can't find the container with id 2e378fd9fce92a03c84200e0004270fa67af34706e747cf40cd8933f5d8de934 Sep 29 10:08:14 crc kubenswrapper[4891]: I0929 10:08:14.077974 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:08:14 crc kubenswrapper[4891]: I0929 10:08:14.408356 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f" 
path="/var/lib/kubelet/pods/0a54b7e2-d3ce-4a8e-af38-a8569a40dd2f/volumes" Sep 29 10:08:14 crc kubenswrapper[4891]: I0929 10:08:14.409955 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4449e962-4991-430a-8104-10c1780fa253" path="/var/lib/kubelet/pods/4449e962-4991-430a-8104-10c1780fa253/volumes" Sep 29 10:08:14 crc kubenswrapper[4891]: I0929 10:08:14.808145 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61","Type":"ContainerStarted","Data":"3abecfeae70fc5eb763ea1154d6f4257217020142608f34ade65ff037077926e"} Sep 29 10:08:14 crc kubenswrapper[4891]: I0929 10:08:14.808187 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61","Type":"ContainerStarted","Data":"f9776fa1bb9f8c227a8826ae796e21cde64af83d1dd806d5cdd0e999972f5aae"} Sep 29 10:08:14 crc kubenswrapper[4891]: I0929 10:08:14.818586 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c908bd0e-e209-465b-b98e-f892a4a270f8","Type":"ContainerStarted","Data":"0e19292bf3d457000f279f22b4cbcec64844d45e05e59085a0a9c3ff82e11d3a"} Sep 29 10:08:14 crc kubenswrapper[4891]: I0929 10:08:14.818629 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c908bd0e-e209-465b-b98e-f892a4a270f8","Type":"ContainerStarted","Data":"2e378fd9fce92a03c84200e0004270fa67af34706e747cf40cd8933f5d8de934"} Sep 29 10:08:14 crc kubenswrapper[4891]: I0929 10:08:14.828618 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.828595726 podStartE2EDuration="2.828595726s" podCreationTimestamp="2025-09-29 10:08:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:08:14.823864322 +0000 UTC m=+1225.029032643" 
watchObservedRunningTime="2025-09-29 10:08:14.828595726 +0000 UTC m=+1225.033764047" Sep 29 10:08:14 crc kubenswrapper[4891]: I0929 10:08:14.971328 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:08:14 crc kubenswrapper[4891]: I0929 10:08:14.992959 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:08:15 crc kubenswrapper[4891]: I0929 10:08:15.851551 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:08:16 crc kubenswrapper[4891]: I0929 10:08:16.144933 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-4tx5q"] Sep 29 10:08:16 crc kubenswrapper[4891]: I0929 10:08:16.146938 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4tx5q" Sep 29 10:08:16 crc kubenswrapper[4891]: I0929 10:08:16.149970 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Sep 29 10:08:16 crc kubenswrapper[4891]: I0929 10:08:16.157214 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Sep 29 10:08:16 crc kubenswrapper[4891]: I0929 10:08:16.160650 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-4tx5q"] Sep 29 10:08:16 crc kubenswrapper[4891]: I0929 10:08:16.248465 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15c459e2-314b-44bd-9e6a-7ae0b907b4b7-scripts\") pod \"nova-cell1-cell-mapping-4tx5q\" (UID: \"15c459e2-314b-44bd-9e6a-7ae0b907b4b7\") " pod="openstack/nova-cell1-cell-mapping-4tx5q" Sep 29 10:08:16 crc kubenswrapper[4891]: I0929 10:08:16.248994 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15c459e2-314b-44bd-9e6a-7ae0b907b4b7-config-data\") pod \"nova-cell1-cell-mapping-4tx5q\" (UID: \"15c459e2-314b-44bd-9e6a-7ae0b907b4b7\") " pod="openstack/nova-cell1-cell-mapping-4tx5q" Sep 29 10:08:16 crc kubenswrapper[4891]: I0929 10:08:16.249158 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15c459e2-314b-44bd-9e6a-7ae0b907b4b7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4tx5q\" (UID: \"15c459e2-314b-44bd-9e6a-7ae0b907b4b7\") " pod="openstack/nova-cell1-cell-mapping-4tx5q" Sep 29 10:08:16 crc kubenswrapper[4891]: I0929 10:08:16.249208 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8jpn\" (UniqueName: \"kubernetes.io/projected/15c459e2-314b-44bd-9e6a-7ae0b907b4b7-kube-api-access-j8jpn\") pod \"nova-cell1-cell-mapping-4tx5q\" (UID: \"15c459e2-314b-44bd-9e6a-7ae0b907b4b7\") " pod="openstack/nova-cell1-cell-mapping-4tx5q" Sep 29 10:08:16 crc kubenswrapper[4891]: I0929 10:08:16.350876 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15c459e2-314b-44bd-9e6a-7ae0b907b4b7-scripts\") pod \"nova-cell1-cell-mapping-4tx5q\" (UID: \"15c459e2-314b-44bd-9e6a-7ae0b907b4b7\") " pod="openstack/nova-cell1-cell-mapping-4tx5q" Sep 29 10:08:16 crc kubenswrapper[4891]: I0929 10:08:16.351045 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15c459e2-314b-44bd-9e6a-7ae0b907b4b7-config-data\") pod \"nova-cell1-cell-mapping-4tx5q\" (UID: \"15c459e2-314b-44bd-9e6a-7ae0b907b4b7\") " pod="openstack/nova-cell1-cell-mapping-4tx5q" Sep 29 10:08:16 crc kubenswrapper[4891]: I0929 10:08:16.351098 4891 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15c459e2-314b-44bd-9e6a-7ae0b907b4b7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4tx5q\" (UID: \"15c459e2-314b-44bd-9e6a-7ae0b907b4b7\") " pod="openstack/nova-cell1-cell-mapping-4tx5q" Sep 29 10:08:16 crc kubenswrapper[4891]: I0929 10:08:16.351125 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8jpn\" (UniqueName: \"kubernetes.io/projected/15c459e2-314b-44bd-9e6a-7ae0b907b4b7-kube-api-access-j8jpn\") pod \"nova-cell1-cell-mapping-4tx5q\" (UID: \"15c459e2-314b-44bd-9e6a-7ae0b907b4b7\") " pod="openstack/nova-cell1-cell-mapping-4tx5q" Sep 29 10:08:16 crc kubenswrapper[4891]: I0929 10:08:16.357718 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15c459e2-314b-44bd-9e6a-7ae0b907b4b7-scripts\") pod \"nova-cell1-cell-mapping-4tx5q\" (UID: \"15c459e2-314b-44bd-9e6a-7ae0b907b4b7\") " pod="openstack/nova-cell1-cell-mapping-4tx5q" Sep 29 10:08:16 crc kubenswrapper[4891]: I0929 10:08:16.358033 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15c459e2-314b-44bd-9e6a-7ae0b907b4b7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4tx5q\" (UID: \"15c459e2-314b-44bd-9e6a-7ae0b907b4b7\") " pod="openstack/nova-cell1-cell-mapping-4tx5q" Sep 29 10:08:16 crc kubenswrapper[4891]: I0929 10:08:16.360518 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15c459e2-314b-44bd-9e6a-7ae0b907b4b7-config-data\") pod \"nova-cell1-cell-mapping-4tx5q\" (UID: \"15c459e2-314b-44bd-9e6a-7ae0b907b4b7\") " pod="openstack/nova-cell1-cell-mapping-4tx5q" Sep 29 10:08:16 crc kubenswrapper[4891]: I0929 10:08:16.383199 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8jpn\" 
(UniqueName: \"kubernetes.io/projected/15c459e2-314b-44bd-9e6a-7ae0b907b4b7-kube-api-access-j8jpn\") pod \"nova-cell1-cell-mapping-4tx5q\" (UID: \"15c459e2-314b-44bd-9e6a-7ae0b907b4b7\") " pod="openstack/nova-cell1-cell-mapping-4tx5q" Sep 29 10:08:16 crc kubenswrapper[4891]: I0929 10:08:16.481604 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4tx5q" Sep 29 10:08:16 crc kubenswrapper[4891]: I0929 10:08:16.813567 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" Sep 29 10:08:16 crc kubenswrapper[4891]: I0929 10:08:16.845887 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c908bd0e-e209-465b-b98e-f892a4a270f8","Type":"ContainerStarted","Data":"7d2a614dd8b9665552651072927ba040e646547ba95bfaec9319b47e006b8b62"} Sep 29 10:08:16 crc kubenswrapper[4891]: I0929 10:08:16.922611 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-hb5lh"] Sep 29 10:08:16 crc kubenswrapper[4891]: I0929 10:08:16.922966 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-865f5d856f-hb5lh" podUID="dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4" containerName="dnsmasq-dns" containerID="cri-o://c08fb546c939415a96fe25c5c3a74393d04753898c97fb994d9fc9cdd74a3c1c" gracePeriod=10 Sep 29 10:08:16 crc kubenswrapper[4891]: I0929 10:08:16.984056 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-4tx5q"] Sep 29 10:08:17 crc kubenswrapper[4891]: W0929 10:08:17.001661 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15c459e2_314b_44bd_9e6a_7ae0b907b4b7.slice/crio-447a51252f9dec44dbb84e0451585c3295af00b4c854668f9e5f23d7629b6c85 WatchSource:0}: Error finding container 
447a51252f9dec44dbb84e0451585c3295af00b4c854668f9e5f23d7629b6c85: Status 404 returned error can't find the container with id 447a51252f9dec44dbb84e0451585c3295af00b4c854668f9e5f23d7629b6c85 Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.458596 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-hb5lh" Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.594331 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-dns-swift-storage-0\") pod \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\" (UID: \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\") " Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.594451 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-dns-svc\") pod \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\" (UID: \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\") " Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.594484 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-ovsdbserver-sb\") pod \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\" (UID: \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\") " Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.594548 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m95xl\" (UniqueName: \"kubernetes.io/projected/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-kube-api-access-m95xl\") pod \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\" (UID: \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\") " Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.594648 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-ovsdbserver-nb\") pod \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\" (UID: \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\") " Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.594673 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-config\") pod \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\" (UID: \"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4\") " Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.602206 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-kube-api-access-m95xl" (OuterVolumeSpecName: "kube-api-access-m95xl") pod "dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4" (UID: "dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4"). InnerVolumeSpecName "kube-api-access-m95xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.648368 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4" (UID: "dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.657269 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4" (UID: "dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.660411 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4" (UID: "dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.662090 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4" (UID: "dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.666573 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-config" (OuterVolumeSpecName: "config") pod "dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4" (UID: "dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.700392 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.700442 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.700461 4891 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.700472 4891 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.700483 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.700495 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m95xl\" (UniqueName: \"kubernetes.io/projected/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4-kube-api-access-m95xl\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.865568 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c908bd0e-e209-465b-b98e-f892a4a270f8","Type":"ContainerStarted","Data":"c79b6edd4cfda87ea421c32d285c647cbb1a7abc9ac9bbca0f709555407695b0"} Sep 29 10:08:17 crc kubenswrapper[4891]: 
I0929 10:08:17.868337 4891 generic.go:334] "Generic (PLEG): container finished" podID="dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4" containerID="c08fb546c939415a96fe25c5c3a74393d04753898c97fb994d9fc9cdd74a3c1c" exitCode=0 Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.868438 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-hb5lh" event={"ID":"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4","Type":"ContainerDied","Data":"c08fb546c939415a96fe25c5c3a74393d04753898c97fb994d9fc9cdd74a3c1c"} Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.868481 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-hb5lh" event={"ID":"dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4","Type":"ContainerDied","Data":"fb2e92c62b0ed0fa59021e17c9f1e50c03ea5db71fe71fbf00212cd6db7e45a9"} Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.868518 4891 scope.go:117] "RemoveContainer" containerID="c08fb546c939415a96fe25c5c3a74393d04753898c97fb994d9fc9cdd74a3c1c" Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.868764 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-hb5lh" Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.873369 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4tx5q" event={"ID":"15c459e2-314b-44bd-9e6a-7ae0b907b4b7","Type":"ContainerStarted","Data":"9f78c704768ceb5f8c3bfc3e9306f10ff8c14f68a4d412f0ba7036b8319e5497"} Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.873506 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4tx5q" event={"ID":"15c459e2-314b-44bd-9e6a-7ae0b907b4b7","Type":"ContainerStarted","Data":"447a51252f9dec44dbb84e0451585c3295af00b4c854668f9e5f23d7629b6c85"} Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.898115 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-4tx5q" podStartSLOduration=1.8980986789999998 podStartE2EDuration="1.898098679s" podCreationTimestamp="2025-09-29 10:08:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:08:17.88796402 +0000 UTC m=+1228.093132341" watchObservedRunningTime="2025-09-29 10:08:17.898098679 +0000 UTC m=+1228.103267000" Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.903944 4891 scope.go:117] "RemoveContainer" containerID="af9f22179b80c4585857bf4d15a39ad22e0d3382386db9c8c3847791084ee533" Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.918826 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-hb5lh"] Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.933573 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-hb5lh"] Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.951984 4891 scope.go:117] "RemoveContainer" containerID="c08fb546c939415a96fe25c5c3a74393d04753898c97fb994d9fc9cdd74a3c1c" Sep 29 10:08:17 crc 
kubenswrapper[4891]: E0929 10:08:17.953870 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c08fb546c939415a96fe25c5c3a74393d04753898c97fb994d9fc9cdd74a3c1c\": container with ID starting with c08fb546c939415a96fe25c5c3a74393d04753898c97fb994d9fc9cdd74a3c1c not found: ID does not exist" containerID="c08fb546c939415a96fe25c5c3a74393d04753898c97fb994d9fc9cdd74a3c1c" Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.953917 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c08fb546c939415a96fe25c5c3a74393d04753898c97fb994d9fc9cdd74a3c1c"} err="failed to get container status \"c08fb546c939415a96fe25c5c3a74393d04753898c97fb994d9fc9cdd74a3c1c\": rpc error: code = NotFound desc = could not find container \"c08fb546c939415a96fe25c5c3a74393d04753898c97fb994d9fc9cdd74a3c1c\": container with ID starting with c08fb546c939415a96fe25c5c3a74393d04753898c97fb994d9fc9cdd74a3c1c not found: ID does not exist" Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.953946 4891 scope.go:117] "RemoveContainer" containerID="af9f22179b80c4585857bf4d15a39ad22e0d3382386db9c8c3847791084ee533" Sep 29 10:08:17 crc kubenswrapper[4891]: E0929 10:08:17.956077 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af9f22179b80c4585857bf4d15a39ad22e0d3382386db9c8c3847791084ee533\": container with ID starting with af9f22179b80c4585857bf4d15a39ad22e0d3382386db9c8c3847791084ee533 not found: ID does not exist" containerID="af9f22179b80c4585857bf4d15a39ad22e0d3382386db9c8c3847791084ee533" Sep 29 10:08:17 crc kubenswrapper[4891]: I0929 10:08:17.956109 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af9f22179b80c4585857bf4d15a39ad22e0d3382386db9c8c3847791084ee533"} err="failed to get container status 
\"af9f22179b80c4585857bf4d15a39ad22e0d3382386db9c8c3847791084ee533\": rpc error: code = NotFound desc = could not find container \"af9f22179b80c4585857bf4d15a39ad22e0d3382386db9c8c3847791084ee533\": container with ID starting with af9f22179b80c4585857bf4d15a39ad22e0d3382386db9c8c3847791084ee533 not found: ID does not exist" Sep 29 10:08:18 crc kubenswrapper[4891]: I0929 10:08:18.408217 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4" path="/var/lib/kubelet/pods/dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4/volumes" Sep 29 10:08:20 crc kubenswrapper[4891]: I0929 10:08:20.920237 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c908bd0e-e209-465b-b98e-f892a4a270f8","Type":"ContainerStarted","Data":"3f2eac9d3ad64740a51fc191cb7d99f24cb7831ca63f0af7e5457050807313d8"} Sep 29 10:08:20 crc kubenswrapper[4891]: I0929 10:08:20.924453 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 29 10:08:20 crc kubenswrapper[4891]: I0929 10:08:20.967815 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.091398735 podStartE2EDuration="7.967763155s" podCreationTimestamp="2025-09-29 10:08:13 +0000 UTC" firstStartedPulling="2025-09-29 10:08:14.096802405 +0000 UTC m=+1224.301970726" lastFinishedPulling="2025-09-29 10:08:19.973166825 +0000 UTC m=+1230.178335146" observedRunningTime="2025-09-29 10:08:20.952072498 +0000 UTC m=+1231.157240899" watchObservedRunningTime="2025-09-29 10:08:20.967763155 +0000 UTC m=+1231.172931516" Sep 29 10:08:22 crc kubenswrapper[4891]: I0929 10:08:22.944688 4891 generic.go:334] "Generic (PLEG): container finished" podID="15c459e2-314b-44bd-9e6a-7ae0b907b4b7" containerID="9f78c704768ceb5f8c3bfc3e9306f10ff8c14f68a4d412f0ba7036b8319e5497" exitCode=0 Sep 29 10:08:22 crc kubenswrapper[4891]: I0929 10:08:22.944850 4891 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4tx5q" event={"ID":"15c459e2-314b-44bd-9e6a-7ae0b907b4b7","Type":"ContainerDied","Data":"9f78c704768ceb5f8c3bfc3e9306f10ff8c14f68a4d412f0ba7036b8319e5497"} Sep 29 10:08:23 crc kubenswrapper[4891]: I0929 10:08:23.217040 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 29 10:08:23 crc kubenswrapper[4891]: I0929 10:08:23.217655 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 29 10:08:24 crc kubenswrapper[4891]: I0929 10:08:24.227023 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3acc3f25-d4d9-44d9-87c8-69d5f5b51e61" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 29 10:08:24 crc kubenswrapper[4891]: I0929 10:08:24.244965 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3acc3f25-d4d9-44d9-87c8-69d5f5b51e61" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 29 10:08:24 crc kubenswrapper[4891]: I0929 10:08:24.324395 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4tx5q" Sep 29 10:08:24 crc kubenswrapper[4891]: I0929 10:08:24.398410 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15c459e2-314b-44bd-9e6a-7ae0b907b4b7-scripts\") pod \"15c459e2-314b-44bd-9e6a-7ae0b907b4b7\" (UID: \"15c459e2-314b-44bd-9e6a-7ae0b907b4b7\") " Sep 29 10:08:24 crc kubenswrapper[4891]: I0929 10:08:24.398926 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15c459e2-314b-44bd-9e6a-7ae0b907b4b7-config-data\") pod \"15c459e2-314b-44bd-9e6a-7ae0b907b4b7\" (UID: \"15c459e2-314b-44bd-9e6a-7ae0b907b4b7\") " Sep 29 10:08:24 crc kubenswrapper[4891]: I0929 10:08:24.399079 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15c459e2-314b-44bd-9e6a-7ae0b907b4b7-combined-ca-bundle\") pod \"15c459e2-314b-44bd-9e6a-7ae0b907b4b7\" (UID: \"15c459e2-314b-44bd-9e6a-7ae0b907b4b7\") " Sep 29 10:08:24 crc kubenswrapper[4891]: I0929 10:08:24.399242 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8jpn\" (UniqueName: \"kubernetes.io/projected/15c459e2-314b-44bd-9e6a-7ae0b907b4b7-kube-api-access-j8jpn\") pod \"15c459e2-314b-44bd-9e6a-7ae0b907b4b7\" (UID: \"15c459e2-314b-44bd-9e6a-7ae0b907b4b7\") " Sep 29 10:08:24 crc kubenswrapper[4891]: I0929 10:08:24.407903 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15c459e2-314b-44bd-9e6a-7ae0b907b4b7-kube-api-access-j8jpn" (OuterVolumeSpecName: "kube-api-access-j8jpn") pod "15c459e2-314b-44bd-9e6a-7ae0b907b4b7" (UID: "15c459e2-314b-44bd-9e6a-7ae0b907b4b7"). InnerVolumeSpecName "kube-api-access-j8jpn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:08:24 crc kubenswrapper[4891]: I0929 10:08:24.408226 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15c459e2-314b-44bd-9e6a-7ae0b907b4b7-scripts" (OuterVolumeSpecName: "scripts") pod "15c459e2-314b-44bd-9e6a-7ae0b907b4b7" (UID: "15c459e2-314b-44bd-9e6a-7ae0b907b4b7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:08:24 crc kubenswrapper[4891]: I0929 10:08:24.433992 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15c459e2-314b-44bd-9e6a-7ae0b907b4b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15c459e2-314b-44bd-9e6a-7ae0b907b4b7" (UID: "15c459e2-314b-44bd-9e6a-7ae0b907b4b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:08:24 crc kubenswrapper[4891]: I0929 10:08:24.449490 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15c459e2-314b-44bd-9e6a-7ae0b907b4b7-config-data" (OuterVolumeSpecName: "config-data") pod "15c459e2-314b-44bd-9e6a-7ae0b907b4b7" (UID: "15c459e2-314b-44bd-9e6a-7ae0b907b4b7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:08:24 crc kubenswrapper[4891]: I0929 10:08:24.502912 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15c459e2-314b-44bd-9e6a-7ae0b907b4b7-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:24 crc kubenswrapper[4891]: I0929 10:08:24.503848 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15c459e2-314b-44bd-9e6a-7ae0b907b4b7-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:24 crc kubenswrapper[4891]: I0929 10:08:24.503892 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15c459e2-314b-44bd-9e6a-7ae0b907b4b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:24 crc kubenswrapper[4891]: I0929 10:08:24.503934 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8jpn\" (UniqueName: \"kubernetes.io/projected/15c459e2-314b-44bd-9e6a-7ae0b907b4b7-kube-api-access-j8jpn\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:24 crc kubenswrapper[4891]: I0929 10:08:24.965111 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4tx5q" event={"ID":"15c459e2-314b-44bd-9e6a-7ae0b907b4b7","Type":"ContainerDied","Data":"447a51252f9dec44dbb84e0451585c3295af00b4c854668f9e5f23d7629b6c85"} Sep 29 10:08:24 crc kubenswrapper[4891]: I0929 10:08:24.965737 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="447a51252f9dec44dbb84e0451585c3295af00b4c854668f9e5f23d7629b6c85" Sep 29 10:08:24 crc kubenswrapper[4891]: I0929 10:08:24.965210 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4tx5q" Sep 29 10:08:25 crc kubenswrapper[4891]: I0929 10:08:25.176367 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:08:25 crc kubenswrapper[4891]: I0929 10:08:25.176848 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3acc3f25-d4d9-44d9-87c8-69d5f5b51e61" containerName="nova-api-log" containerID="cri-o://f9776fa1bb9f8c227a8826ae796e21cde64af83d1dd806d5cdd0e999972f5aae" gracePeriod=30 Sep 29 10:08:25 crc kubenswrapper[4891]: I0929 10:08:25.176986 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3acc3f25-d4d9-44d9-87c8-69d5f5b51e61" containerName="nova-api-api" containerID="cri-o://3abecfeae70fc5eb763ea1154d6f4257217020142608f34ade65ff037077926e" gracePeriod=30 Sep 29 10:08:25 crc kubenswrapper[4891]: I0929 10:08:25.225205 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:08:25 crc kubenswrapper[4891]: I0929 10:08:25.225491 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c501d405-3ec0-4276-bf49-cb633ede21fe" containerName="nova-scheduler-scheduler" containerID="cri-o://264fba4076f4d04ae67af2f111df8c937383ee0f4eb73adf020c1ac859403204" gracePeriod=30 Sep 29 10:08:25 crc kubenswrapper[4891]: I0929 10:08:25.244833 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:08:25 crc kubenswrapper[4891]: I0929 10:08:25.246121 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f8b5291b-8d02-4e53-acdf-4e42b181ec2a" containerName="nova-metadata-log" containerID="cri-o://c5a5e5dedd6c6b1cc7f2f2168320ac975aab9bed5e8821e015c719c2ff2f8a67" gracePeriod=30 Sep 29 10:08:25 crc kubenswrapper[4891]: I0929 10:08:25.246239 4891 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f8b5291b-8d02-4e53-acdf-4e42b181ec2a" containerName="nova-metadata-metadata" containerID="cri-o://a062639fe6d8e14f48c12ff4cb49b4911d1f1a2b4d169cd102e15dc3d4314902" gracePeriod=30 Sep 29 10:08:25 crc kubenswrapper[4891]: E0929 10:08:25.750962 4891 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="264fba4076f4d04ae67af2f111df8c937383ee0f4eb73adf020c1ac859403204" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 29 10:08:25 crc kubenswrapper[4891]: E0929 10:08:25.753001 4891 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="264fba4076f4d04ae67af2f111df8c937383ee0f4eb73adf020c1ac859403204" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 29 10:08:25 crc kubenswrapper[4891]: E0929 10:08:25.754316 4891 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="264fba4076f4d04ae67af2f111df8c937383ee0f4eb73adf020c1ac859403204" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 29 10:08:25 crc kubenswrapper[4891]: E0929 10:08:25.754534 4891 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c501d405-3ec0-4276-bf49-cb633ede21fe" containerName="nova-scheduler-scheduler" Sep 29 10:08:25 crc kubenswrapper[4891]: I0929 10:08:25.986828 4891 generic.go:334] "Generic (PLEG): container finished" 
podID="f8b5291b-8d02-4e53-acdf-4e42b181ec2a" containerID="c5a5e5dedd6c6b1cc7f2f2168320ac975aab9bed5e8821e015c719c2ff2f8a67" exitCode=143 Sep 29 10:08:25 crc kubenswrapper[4891]: I0929 10:08:25.986901 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8b5291b-8d02-4e53-acdf-4e42b181ec2a","Type":"ContainerDied","Data":"c5a5e5dedd6c6b1cc7f2f2168320ac975aab9bed5e8821e015c719c2ff2f8a67"} Sep 29 10:08:25 crc kubenswrapper[4891]: I0929 10:08:25.989469 4891 generic.go:334] "Generic (PLEG): container finished" podID="3acc3f25-d4d9-44d9-87c8-69d5f5b51e61" containerID="f9776fa1bb9f8c227a8826ae796e21cde64af83d1dd806d5cdd0e999972f5aae" exitCode=143 Sep 29 10:08:25 crc kubenswrapper[4891]: I0929 10:08:25.989516 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61","Type":"ContainerDied","Data":"f9776fa1bb9f8c227a8826ae796e21cde64af83d1dd806d5cdd0e999972f5aae"} Sep 29 10:08:28 crc kubenswrapper[4891]: I0929 10:08:28.391264 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="f8b5291b-8d02-4e53-acdf-4e42b181ec2a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": read tcp 10.217.0.2:47696->10.217.0.194:8775: read: connection reset by peer" Sep 29 10:08:28 crc kubenswrapper[4891]: I0929 10:08:28.391284 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="f8b5291b-8d02-4e53-acdf-4e42b181ec2a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": read tcp 10.217.0.2:47698->10.217.0.194:8775: read: connection reset by peer" Sep 29 10:08:28 crc kubenswrapper[4891]: I0929 10:08:28.878837 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.009823 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-logs\") pod \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\" (UID: \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\") " Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.009914 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djd7s\" (UniqueName: \"kubernetes.io/projected/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-kube-api-access-djd7s\") pod \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\" (UID: \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\") " Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.009944 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-config-data\") pod \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\" (UID: \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\") " Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.010026 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-combined-ca-bundle\") pod \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\" (UID: \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\") " Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.010077 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-nova-metadata-tls-certs\") pod \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\" (UID: \"f8b5291b-8d02-4e53-acdf-4e42b181ec2a\") " Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.010437 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-logs" (OuterVolumeSpecName: "logs") pod "f8b5291b-8d02-4e53-acdf-4e42b181ec2a" (UID: "f8b5291b-8d02-4e53-acdf-4e42b181ec2a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.016395 4891 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.026476 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-kube-api-access-djd7s" (OuterVolumeSpecName: "kube-api-access-djd7s") pod "f8b5291b-8d02-4e53-acdf-4e42b181ec2a" (UID: "f8b5291b-8d02-4e53-acdf-4e42b181ec2a"). InnerVolumeSpecName "kube-api-access-djd7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.030159 4891 generic.go:334] "Generic (PLEG): container finished" podID="f8b5291b-8d02-4e53-acdf-4e42b181ec2a" containerID="a062639fe6d8e14f48c12ff4cb49b4911d1f1a2b4d169cd102e15dc3d4314902" exitCode=0 Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.030217 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.030215 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8b5291b-8d02-4e53-acdf-4e42b181ec2a","Type":"ContainerDied","Data":"a062639fe6d8e14f48c12ff4cb49b4911d1f1a2b4d169cd102e15dc3d4314902"} Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.030352 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8b5291b-8d02-4e53-acdf-4e42b181ec2a","Type":"ContainerDied","Data":"d86e8feb654b9ef239ae43c6eb7710155de4f3ee92544bff2cb09af64667a3dc"} Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.030379 4891 scope.go:117] "RemoveContainer" containerID="a062639fe6d8e14f48c12ff4cb49b4911d1f1a2b4d169cd102e15dc3d4314902" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.053290 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-config-data" (OuterVolumeSpecName: "config-data") pod "f8b5291b-8d02-4e53-acdf-4e42b181ec2a" (UID: "f8b5291b-8d02-4e53-acdf-4e42b181ec2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.056162 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8b5291b-8d02-4e53-acdf-4e42b181ec2a" (UID: "f8b5291b-8d02-4e53-acdf-4e42b181ec2a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.099377 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f8b5291b-8d02-4e53-acdf-4e42b181ec2a" (UID: "f8b5291b-8d02-4e53-acdf-4e42b181ec2a"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.118811 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djd7s\" (UniqueName: \"kubernetes.io/projected/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-kube-api-access-djd7s\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.118850 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.118863 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.118872 4891 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b5291b-8d02-4e53-acdf-4e42b181ec2a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.146198 4891 scope.go:117] "RemoveContainer" containerID="c5a5e5dedd6c6b1cc7f2f2168320ac975aab9bed5e8821e015c719c2ff2f8a67" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.169949 4891 scope.go:117] "RemoveContainer" containerID="a062639fe6d8e14f48c12ff4cb49b4911d1f1a2b4d169cd102e15dc3d4314902" Sep 29 10:08:29 crc 
kubenswrapper[4891]: E0929 10:08:29.170459 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a062639fe6d8e14f48c12ff4cb49b4911d1f1a2b4d169cd102e15dc3d4314902\": container with ID starting with a062639fe6d8e14f48c12ff4cb49b4911d1f1a2b4d169cd102e15dc3d4314902 not found: ID does not exist" containerID="a062639fe6d8e14f48c12ff4cb49b4911d1f1a2b4d169cd102e15dc3d4314902" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.170529 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a062639fe6d8e14f48c12ff4cb49b4911d1f1a2b4d169cd102e15dc3d4314902"} err="failed to get container status \"a062639fe6d8e14f48c12ff4cb49b4911d1f1a2b4d169cd102e15dc3d4314902\": rpc error: code = NotFound desc = could not find container \"a062639fe6d8e14f48c12ff4cb49b4911d1f1a2b4d169cd102e15dc3d4314902\": container with ID starting with a062639fe6d8e14f48c12ff4cb49b4911d1f1a2b4d169cd102e15dc3d4314902 not found: ID does not exist" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.170579 4891 scope.go:117] "RemoveContainer" containerID="c5a5e5dedd6c6b1cc7f2f2168320ac975aab9bed5e8821e015c719c2ff2f8a67" Sep 29 10:08:29 crc kubenswrapper[4891]: E0929 10:08:29.171028 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5a5e5dedd6c6b1cc7f2f2168320ac975aab9bed5e8821e015c719c2ff2f8a67\": container with ID starting with c5a5e5dedd6c6b1cc7f2f2168320ac975aab9bed5e8821e015c719c2ff2f8a67 not found: ID does not exist" containerID="c5a5e5dedd6c6b1cc7f2f2168320ac975aab9bed5e8821e015c719c2ff2f8a67" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.171057 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5a5e5dedd6c6b1cc7f2f2168320ac975aab9bed5e8821e015c719c2ff2f8a67"} err="failed to get container status 
\"c5a5e5dedd6c6b1cc7f2f2168320ac975aab9bed5e8821e015c719c2ff2f8a67\": rpc error: code = NotFound desc = could not find container \"c5a5e5dedd6c6b1cc7f2f2168320ac975aab9bed5e8821e015c719c2ff2f8a67\": container with ID starting with c5a5e5dedd6c6b1cc7f2f2168320ac975aab9bed5e8821e015c719c2ff2f8a67 not found: ID does not exist" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.376834 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.385690 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.403337 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:08:29 crc kubenswrapper[4891]: E0929 10:08:29.403763 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15c459e2-314b-44bd-9e6a-7ae0b907b4b7" containerName="nova-manage" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.403778 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="15c459e2-314b-44bd-9e6a-7ae0b907b4b7" containerName="nova-manage" Sep 29 10:08:29 crc kubenswrapper[4891]: E0929 10:08:29.403872 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4" containerName="dnsmasq-dns" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.403880 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4" containerName="dnsmasq-dns" Sep 29 10:08:29 crc kubenswrapper[4891]: E0929 10:08:29.403892 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8b5291b-8d02-4e53-acdf-4e42b181ec2a" containerName="nova-metadata-log" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.403899 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b5291b-8d02-4e53-acdf-4e42b181ec2a" containerName="nova-metadata-log" Sep 29 10:08:29 crc kubenswrapper[4891]: 
E0929 10:08:29.403910 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4" containerName="init" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.403916 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4" containerName="init" Sep 29 10:08:29 crc kubenswrapper[4891]: E0929 10:08:29.403928 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8b5291b-8d02-4e53-acdf-4e42b181ec2a" containerName="nova-metadata-metadata" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.403933 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b5291b-8d02-4e53-acdf-4e42b181ec2a" containerName="nova-metadata-metadata" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.404116 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8b5291b-8d02-4e53-acdf-4e42b181ec2a" containerName="nova-metadata-log" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.404137 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbdfc2d8-bf46-4c65-91d8-a6dbe38d99d4" containerName="dnsmasq-dns" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.404146 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8b5291b-8d02-4e53-acdf-4e42b181ec2a" containerName="nova-metadata-metadata" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.404156 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="15c459e2-314b-44bd-9e6a-7ae0b907b4b7" containerName="nova-manage" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.405138 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.407749 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.409104 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.421641 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.528673 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhm5c\" (UniqueName: \"kubernetes.io/projected/17f1cdf8-c8a7-42b7-a864-e89db1b08cb7-kube-api-access-dhm5c\") pod \"nova-metadata-0\" (UID: \"17f1cdf8-c8a7-42b7-a864-e89db1b08cb7\") " pod="openstack/nova-metadata-0" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.528767 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/17f1cdf8-c8a7-42b7-a864-e89db1b08cb7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"17f1cdf8-c8a7-42b7-a864-e89db1b08cb7\") " pod="openstack/nova-metadata-0" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.528815 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17f1cdf8-c8a7-42b7-a864-e89db1b08cb7-config-data\") pod \"nova-metadata-0\" (UID: \"17f1cdf8-c8a7-42b7-a864-e89db1b08cb7\") " pod="openstack/nova-metadata-0" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.528979 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/17f1cdf8-c8a7-42b7-a864-e89db1b08cb7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"17f1cdf8-c8a7-42b7-a864-e89db1b08cb7\") " pod="openstack/nova-metadata-0" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.529003 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17f1cdf8-c8a7-42b7-a864-e89db1b08cb7-logs\") pod \"nova-metadata-0\" (UID: \"17f1cdf8-c8a7-42b7-a864-e89db1b08cb7\") " pod="openstack/nova-metadata-0" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.631433 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17f1cdf8-c8a7-42b7-a864-e89db1b08cb7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"17f1cdf8-c8a7-42b7-a864-e89db1b08cb7\") " pod="openstack/nova-metadata-0" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.631484 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17f1cdf8-c8a7-42b7-a864-e89db1b08cb7-logs\") pod \"nova-metadata-0\" (UID: \"17f1cdf8-c8a7-42b7-a864-e89db1b08cb7\") " pod="openstack/nova-metadata-0" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.631529 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhm5c\" (UniqueName: \"kubernetes.io/projected/17f1cdf8-c8a7-42b7-a864-e89db1b08cb7-kube-api-access-dhm5c\") pod \"nova-metadata-0\" (UID: \"17f1cdf8-c8a7-42b7-a864-e89db1b08cb7\") " pod="openstack/nova-metadata-0" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.631567 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/17f1cdf8-c8a7-42b7-a864-e89db1b08cb7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"17f1cdf8-c8a7-42b7-a864-e89db1b08cb7\") " 
pod="openstack/nova-metadata-0" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.631585 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17f1cdf8-c8a7-42b7-a864-e89db1b08cb7-config-data\") pod \"nova-metadata-0\" (UID: \"17f1cdf8-c8a7-42b7-a864-e89db1b08cb7\") " pod="openstack/nova-metadata-0" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.632124 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17f1cdf8-c8a7-42b7-a864-e89db1b08cb7-logs\") pod \"nova-metadata-0\" (UID: \"17f1cdf8-c8a7-42b7-a864-e89db1b08cb7\") " pod="openstack/nova-metadata-0" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.643419 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/17f1cdf8-c8a7-42b7-a864-e89db1b08cb7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"17f1cdf8-c8a7-42b7-a864-e89db1b08cb7\") " pod="openstack/nova-metadata-0" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.647887 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17f1cdf8-c8a7-42b7-a864-e89db1b08cb7-config-data\") pod \"nova-metadata-0\" (UID: \"17f1cdf8-c8a7-42b7-a864-e89db1b08cb7\") " pod="openstack/nova-metadata-0" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.648527 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17f1cdf8-c8a7-42b7-a864-e89db1b08cb7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"17f1cdf8-c8a7-42b7-a864-e89db1b08cb7\") " pod="openstack/nova-metadata-0" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.652256 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhm5c\" (UniqueName: 
\"kubernetes.io/projected/17f1cdf8-c8a7-42b7-a864-e89db1b08cb7-kube-api-access-dhm5c\") pod \"nova-metadata-0\" (UID: \"17f1cdf8-c8a7-42b7-a864-e89db1b08cb7\") " pod="openstack/nova-metadata-0" Sep 29 10:08:29 crc kubenswrapper[4891]: I0929 10:08:29.723379 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.042224 4891 generic.go:334] "Generic (PLEG): container finished" podID="3acc3f25-d4d9-44d9-87c8-69d5f5b51e61" containerID="3abecfeae70fc5eb763ea1154d6f4257217020142608f34ade65ff037077926e" exitCode=0 Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.042302 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61","Type":"ContainerDied","Data":"3abecfeae70fc5eb763ea1154d6f4257217020142608f34ade65ff037077926e"} Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.045282 4891 generic.go:334] "Generic (PLEG): container finished" podID="c501d405-3ec0-4276-bf49-cb633ede21fe" containerID="264fba4076f4d04ae67af2f111df8c937383ee0f4eb73adf020c1ac859403204" exitCode=0 Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.045327 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c501d405-3ec0-4276-bf49-cb633ede21fe","Type":"ContainerDied","Data":"264fba4076f4d04ae67af2f111df8c937383ee0f4eb73adf020c1ac859403204"} Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.045348 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c501d405-3ec0-4276-bf49-cb633ede21fe","Type":"ContainerDied","Data":"7af06c5f4cd0743117a1f2365b8827bbe3224452a18ad6abe438e85bda17e7df"} Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.045359 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7af06c5f4cd0743117a1f2365b8827bbe3224452a18ad6abe438e85bda17e7df" Sep 29 
10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.054778 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.097143 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.142762 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c501d405-3ec0-4276-bf49-cb633ede21fe-config-data\") pod \"c501d405-3ec0-4276-bf49-cb633ede21fe\" (UID: \"c501d405-3ec0-4276-bf49-cb633ede21fe\") " Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.142944 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c501d405-3ec0-4276-bf49-cb633ede21fe-combined-ca-bundle\") pod \"c501d405-3ec0-4276-bf49-cb633ede21fe\" (UID: \"c501d405-3ec0-4276-bf49-cb633ede21fe\") " Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.143039 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsnnv\" (UniqueName: \"kubernetes.io/projected/c501d405-3ec0-4276-bf49-cb633ede21fe-kube-api-access-vsnnv\") pod \"c501d405-3ec0-4276-bf49-cb633ede21fe\" (UID: \"c501d405-3ec0-4276-bf49-cb633ede21fe\") " Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.149117 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c501d405-3ec0-4276-bf49-cb633ede21fe-kube-api-access-vsnnv" (OuterVolumeSpecName: "kube-api-access-vsnnv") pod "c501d405-3ec0-4276-bf49-cb633ede21fe" (UID: "c501d405-3ec0-4276-bf49-cb633ede21fe"). InnerVolumeSpecName "kube-api-access-vsnnv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.170412 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c501d405-3ec0-4276-bf49-cb633ede21fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c501d405-3ec0-4276-bf49-cb633ede21fe" (UID: "c501d405-3ec0-4276-bf49-cb633ede21fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.175934 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c501d405-3ec0-4276-bf49-cb633ede21fe-config-data" (OuterVolumeSpecName: "config-data") pod "c501d405-3ec0-4276-bf49-cb633ede21fe" (UID: "c501d405-3ec0-4276-bf49-cb633ede21fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.244455 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-config-data\") pod \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\" (UID: \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\") " Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.245171 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-internal-tls-certs\") pod \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\" (UID: \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\") " Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.245434 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-logs\") pod \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\" (UID: \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\") " Sep 29 10:08:30 crc 
kubenswrapper[4891]: I0929 10:08:30.245619 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-combined-ca-bundle\") pod \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\" (UID: \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\") " Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.245814 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw9bc\" (UniqueName: \"kubernetes.io/projected/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-kube-api-access-zw9bc\") pod \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\" (UID: \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\") " Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.245942 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-public-tls-certs\") pod \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\" (UID: \"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61\") " Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.245878 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-logs" (OuterVolumeSpecName: "logs") pod "3acc3f25-d4d9-44d9-87c8-69d5f5b51e61" (UID: "3acc3f25-d4d9-44d9-87c8-69d5f5b51e61"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.246997 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c501d405-3ec0-4276-bf49-cb633ede21fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.247081 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsnnv\" (UniqueName: \"kubernetes.io/projected/c501d405-3ec0-4276-bf49-cb633ede21fe-kube-api-access-vsnnv\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.247143 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c501d405-3ec0-4276-bf49-cb633ede21fe-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.247199 4891 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.252945 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-kube-api-access-zw9bc" (OuterVolumeSpecName: "kube-api-access-zw9bc") pod "3acc3f25-d4d9-44d9-87c8-69d5f5b51e61" (UID: "3acc3f25-d4d9-44d9-87c8-69d5f5b51e61"). InnerVolumeSpecName "kube-api-access-zw9bc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.255780 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:08:30 crc kubenswrapper[4891]: W0929 10:08:30.257579 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17f1cdf8_c8a7_42b7_a864_e89db1b08cb7.slice/crio-5573380ad1650035627fbed70f0df59d8727755912b89f14a7f0f89a8655c1f5 WatchSource:0}: Error finding container 5573380ad1650035627fbed70f0df59d8727755912b89f14a7f0f89a8655c1f5: Status 404 returned error can't find the container with id 5573380ad1650035627fbed70f0df59d8727755912b89f14a7f0f89a8655c1f5 Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.272656 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3acc3f25-d4d9-44d9-87c8-69d5f5b51e61" (UID: "3acc3f25-d4d9-44d9-87c8-69d5f5b51e61"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.284888 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-config-data" (OuterVolumeSpecName: "config-data") pod "3acc3f25-d4d9-44d9-87c8-69d5f5b51e61" (UID: "3acc3f25-d4d9-44d9-87c8-69d5f5b51e61"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.303076 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3acc3f25-d4d9-44d9-87c8-69d5f5b51e61" (UID: "3acc3f25-d4d9-44d9-87c8-69d5f5b51e61"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.306022 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3acc3f25-d4d9-44d9-87c8-69d5f5b51e61" (UID: "3acc3f25-d4d9-44d9-87c8-69d5f5b51e61"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.350173 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.350587 4891 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.350604 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.350650 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw9bc\" (UniqueName: \"kubernetes.io/projected/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-kube-api-access-zw9bc\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.350664 4891 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:30 crc kubenswrapper[4891]: I0929 10:08:30.418053 4891 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="f8b5291b-8d02-4e53-acdf-4e42b181ec2a" path="/var/lib/kubelet/pods/f8b5291b-8d02-4e53-acdf-4e42b181ec2a/volumes" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.059177 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3acc3f25-d4d9-44d9-87c8-69d5f5b51e61","Type":"ContainerDied","Data":"c74b9a2d1e212af50dcdc7c8c9a3924d9fd63d794653b06bdd7c10e7948ff41c"} Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.059485 4891 scope.go:117] "RemoveContainer" containerID="3abecfeae70fc5eb763ea1154d6f4257217020142608f34ade65ff037077926e" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.059268 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.061539 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.061556 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"17f1cdf8-c8a7-42b7-a864-e89db1b08cb7","Type":"ContainerStarted","Data":"ff3e07016b6c91aacfd590c6684f1a69aca6162861ce2524a6c6df858b9e9e17"} Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.061580 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"17f1cdf8-c8a7-42b7-a864-e89db1b08cb7","Type":"ContainerStarted","Data":"6e3b9a3725d71f31293e937e5bb9389cf8f46939454eb64c53326d270a9f15f6"} Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.061592 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"17f1cdf8-c8a7-42b7-a864-e89db1b08cb7","Type":"ContainerStarted","Data":"5573380ad1650035627fbed70f0df59d8727755912b89f14a7f0f89a8655c1f5"} Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.098156 4891 scope.go:117] "RemoveContainer" 
containerID="f9776fa1bb9f8c227a8826ae796e21cde64af83d1dd806d5cdd0e999972f5aae" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.113736 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.134885 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.146802 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:08:31 crc kubenswrapper[4891]: E0929 10:08:31.147339 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3acc3f25-d4d9-44d9-87c8-69d5f5b51e61" containerName="nova-api-api" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.147366 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="3acc3f25-d4d9-44d9-87c8-69d5f5b51e61" containerName="nova-api-api" Sep 29 10:08:31 crc kubenswrapper[4891]: E0929 10:08:31.147389 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3acc3f25-d4d9-44d9-87c8-69d5f5b51e61" containerName="nova-api-log" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.147399 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="3acc3f25-d4d9-44d9-87c8-69d5f5b51e61" containerName="nova-api-log" Sep 29 10:08:31 crc kubenswrapper[4891]: E0929 10:08:31.147437 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c501d405-3ec0-4276-bf49-cb633ede21fe" containerName="nova-scheduler-scheduler" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.147446 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="c501d405-3ec0-4276-bf49-cb633ede21fe" containerName="nova-scheduler-scheduler" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.147703 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="c501d405-3ec0-4276-bf49-cb633ede21fe" containerName="nova-scheduler-scheduler" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 
10:08:31.147724 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="3acc3f25-d4d9-44d9-87c8-69d5f5b51e61" containerName="nova-api-log" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.147739 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="3acc3f25-d4d9-44d9-87c8-69d5f5b51e61" containerName="nova-api-api" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.148870 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.157078 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.1570608079999998 podStartE2EDuration="2.157060808s" podCreationTimestamp="2025-09-29 10:08:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:08:31.120132756 +0000 UTC m=+1241.325301087" watchObservedRunningTime="2025-09-29 10:08:31.157060808 +0000 UTC m=+1241.362229139" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.157122 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.182535 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.197350 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.204950 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.213971 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.215746 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.218076 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.218259 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.218821 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.240134 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.272116 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4slgz\" (UniqueName: \"kubernetes.io/projected/067392b2-a609-44b9-8796-26df77b11d9e-kube-api-access-4slgz\") pod \"nova-scheduler-0\" (UID: \"067392b2-a609-44b9-8796-26df77b11d9e\") " pod="openstack/nova-scheduler-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.272182 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067392b2-a609-44b9-8796-26df77b11d9e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"067392b2-a609-44b9-8796-26df77b11d9e\") " pod="openstack/nova-scheduler-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.272444 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/067392b2-a609-44b9-8796-26df77b11d9e-config-data\") pod \"nova-scheduler-0\" (UID: \"067392b2-a609-44b9-8796-26df77b11d9e\") " pod="openstack/nova-scheduler-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.376426 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae3bbc79-5ed8-4064-bc90-554ca707171b-config-data\") pod \"nova-api-0\" (UID: \"ae3bbc79-5ed8-4064-bc90-554ca707171b\") " pod="openstack/nova-api-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.376486 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067392b2-a609-44b9-8796-26df77b11d9e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"067392b2-a609-44b9-8796-26df77b11d9e\") " pod="openstack/nova-scheduler-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.376628 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae3bbc79-5ed8-4064-bc90-554ca707171b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ae3bbc79-5ed8-4064-bc90-554ca707171b\") " pod="openstack/nova-api-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.376673 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/067392b2-a609-44b9-8796-26df77b11d9e-config-data\") pod \"nova-scheduler-0\" (UID: \"067392b2-a609-44b9-8796-26df77b11d9e\") " pod="openstack/nova-scheduler-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.376771 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae3bbc79-5ed8-4064-bc90-554ca707171b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ae3bbc79-5ed8-4064-bc90-554ca707171b\") " pod="openstack/nova-api-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.376904 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4slgz\" (UniqueName: 
\"kubernetes.io/projected/067392b2-a609-44b9-8796-26df77b11d9e-kube-api-access-4slgz\") pod \"nova-scheduler-0\" (UID: \"067392b2-a609-44b9-8796-26df77b11d9e\") " pod="openstack/nova-scheduler-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.376928 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae3bbc79-5ed8-4064-bc90-554ca707171b-public-tls-certs\") pod \"nova-api-0\" (UID: \"ae3bbc79-5ed8-4064-bc90-554ca707171b\") " pod="openstack/nova-api-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.376954 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z69lz\" (UniqueName: \"kubernetes.io/projected/ae3bbc79-5ed8-4064-bc90-554ca707171b-kube-api-access-z69lz\") pod \"nova-api-0\" (UID: \"ae3bbc79-5ed8-4064-bc90-554ca707171b\") " pod="openstack/nova-api-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.376980 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae3bbc79-5ed8-4064-bc90-554ca707171b-logs\") pod \"nova-api-0\" (UID: \"ae3bbc79-5ed8-4064-bc90-554ca707171b\") " pod="openstack/nova-api-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.381317 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/067392b2-a609-44b9-8796-26df77b11d9e-config-data\") pod \"nova-scheduler-0\" (UID: \"067392b2-a609-44b9-8796-26df77b11d9e\") " pod="openstack/nova-scheduler-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.382339 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067392b2-a609-44b9-8796-26df77b11d9e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"067392b2-a609-44b9-8796-26df77b11d9e\") " 
pod="openstack/nova-scheduler-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.413821 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4slgz\" (UniqueName: \"kubernetes.io/projected/067392b2-a609-44b9-8796-26df77b11d9e-kube-api-access-4slgz\") pod \"nova-scheduler-0\" (UID: \"067392b2-a609-44b9-8796-26df77b11d9e\") " pod="openstack/nova-scheduler-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.479385 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae3bbc79-5ed8-4064-bc90-554ca707171b-public-tls-certs\") pod \"nova-api-0\" (UID: \"ae3bbc79-5ed8-4064-bc90-554ca707171b\") " pod="openstack/nova-api-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.479434 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z69lz\" (UniqueName: \"kubernetes.io/projected/ae3bbc79-5ed8-4064-bc90-554ca707171b-kube-api-access-z69lz\") pod \"nova-api-0\" (UID: \"ae3bbc79-5ed8-4064-bc90-554ca707171b\") " pod="openstack/nova-api-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.479462 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae3bbc79-5ed8-4064-bc90-554ca707171b-logs\") pod \"nova-api-0\" (UID: \"ae3bbc79-5ed8-4064-bc90-554ca707171b\") " pod="openstack/nova-api-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.479484 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae3bbc79-5ed8-4064-bc90-554ca707171b-config-data\") pod \"nova-api-0\" (UID: \"ae3bbc79-5ed8-4064-bc90-554ca707171b\") " pod="openstack/nova-api-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.479555 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ae3bbc79-5ed8-4064-bc90-554ca707171b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ae3bbc79-5ed8-4064-bc90-554ca707171b\") " pod="openstack/nova-api-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.479618 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae3bbc79-5ed8-4064-bc90-554ca707171b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ae3bbc79-5ed8-4064-bc90-554ca707171b\") " pod="openstack/nova-api-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.480248 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae3bbc79-5ed8-4064-bc90-554ca707171b-logs\") pod \"nova-api-0\" (UID: \"ae3bbc79-5ed8-4064-bc90-554ca707171b\") " pod="openstack/nova-api-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.494018 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae3bbc79-5ed8-4064-bc90-554ca707171b-public-tls-certs\") pod \"nova-api-0\" (UID: \"ae3bbc79-5ed8-4064-bc90-554ca707171b\") " pod="openstack/nova-api-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.494355 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae3bbc79-5ed8-4064-bc90-554ca707171b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ae3bbc79-5ed8-4064-bc90-554ca707171b\") " pod="openstack/nova-api-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.494881 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.494368 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae3bbc79-5ed8-4064-bc90-554ca707171b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ae3bbc79-5ed8-4064-bc90-554ca707171b\") " pod="openstack/nova-api-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.495029 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae3bbc79-5ed8-4064-bc90-554ca707171b-config-data\") pod \"nova-api-0\" (UID: \"ae3bbc79-5ed8-4064-bc90-554ca707171b\") " pod="openstack/nova-api-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.499545 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z69lz\" (UniqueName: \"kubernetes.io/projected/ae3bbc79-5ed8-4064-bc90-554ca707171b-kube-api-access-z69lz\") pod \"nova-api-0\" (UID: \"ae3bbc79-5ed8-4064-bc90-554ca707171b\") " pod="openstack/nova-api-0" Sep 29 10:08:31 crc kubenswrapper[4891]: I0929 10:08:31.531318 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:08:32 crc kubenswrapper[4891]: I0929 10:08:32.006127 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:08:32 crc kubenswrapper[4891]: W0929 10:08:32.015172 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod067392b2_a609_44b9_8796_26df77b11d9e.slice/crio-581f80a224714075200cc4767e8e99f602a197a78481b7f01d97f43bd7822868 WatchSource:0}: Error finding container 581f80a224714075200cc4767e8e99f602a197a78481b7f01d97f43bd7822868: Status 404 returned error can't find the container with id 581f80a224714075200cc4767e8e99f602a197a78481b7f01d97f43bd7822868 Sep 29 10:08:32 crc kubenswrapper[4891]: I0929 10:08:32.079031 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"067392b2-a609-44b9-8796-26df77b11d9e","Type":"ContainerStarted","Data":"581f80a224714075200cc4767e8e99f602a197a78481b7f01d97f43bd7822868"} Sep 29 10:08:32 crc kubenswrapper[4891]: I0929 10:08:32.087969 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:08:32 crc kubenswrapper[4891]: W0929 10:08:32.094755 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae3bbc79_5ed8_4064_bc90_554ca707171b.slice/crio-5b169a385b50e936e472b6d2abc72cf7714880b5c099339f3bf7df4d443946cb WatchSource:0}: Error finding container 5b169a385b50e936e472b6d2abc72cf7714880b5c099339f3bf7df4d443946cb: Status 404 returned error can't find the container with id 5b169a385b50e936e472b6d2abc72cf7714880b5c099339f3bf7df4d443946cb Sep 29 10:08:32 crc kubenswrapper[4891]: I0929 10:08:32.417444 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3acc3f25-d4d9-44d9-87c8-69d5f5b51e61" path="/var/lib/kubelet/pods/3acc3f25-d4d9-44d9-87c8-69d5f5b51e61/volumes" Sep 29 
10:08:32 crc kubenswrapper[4891]: I0929 10:08:32.418547 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c501d405-3ec0-4276-bf49-cb633ede21fe" path="/var/lib/kubelet/pods/c501d405-3ec0-4276-bf49-cb633ede21fe/volumes" Sep 29 10:08:33 crc kubenswrapper[4891]: I0929 10:08:33.105088 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae3bbc79-5ed8-4064-bc90-554ca707171b","Type":"ContainerStarted","Data":"e94c37bd714a294df8064359f14bf6b38687dcc4999cbe0523f7e3f3c854e480"} Sep 29 10:08:33 crc kubenswrapper[4891]: I0929 10:08:33.107007 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae3bbc79-5ed8-4064-bc90-554ca707171b","Type":"ContainerStarted","Data":"cda1f99c393a43c90e795100929b9c158fc005e7386530ca6f624febd62148c2"} Sep 29 10:08:33 crc kubenswrapper[4891]: I0929 10:08:33.107088 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae3bbc79-5ed8-4064-bc90-554ca707171b","Type":"ContainerStarted","Data":"5b169a385b50e936e472b6d2abc72cf7714880b5c099339f3bf7df4d443946cb"} Sep 29 10:08:33 crc kubenswrapper[4891]: I0929 10:08:33.109125 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"067392b2-a609-44b9-8796-26df77b11d9e","Type":"ContainerStarted","Data":"78a793485850b8cef60a34da439a9b9c269467379186c0089c11f9c4b06030b3"} Sep 29 10:08:33 crc kubenswrapper[4891]: I0929 10:08:33.167259 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.167229986 podStartE2EDuration="2.167229986s" podCreationTimestamp="2025-09-29 10:08:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:08:33.142682196 +0000 UTC m=+1243.347850567" watchObservedRunningTime="2025-09-29 10:08:33.167229986 +0000 UTC m=+1243.372398317" Sep 29 
10:08:33 crc kubenswrapper[4891]: I0929 10:08:33.197504 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.197481348 podStartE2EDuration="2.197481348s" podCreationTimestamp="2025-09-29 10:08:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:08:33.186428233 +0000 UTC m=+1243.391596574" watchObservedRunningTime="2025-09-29 10:08:33.197481348 +0000 UTC m=+1243.402649679" Sep 29 10:08:34 crc kubenswrapper[4891]: I0929 10:08:34.723564 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 29 10:08:34 crc kubenswrapper[4891]: I0929 10:08:34.727515 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 29 10:08:36 crc kubenswrapper[4891]: I0929 10:08:36.496082 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 29 10:08:39 crc kubenswrapper[4891]: I0929 10:08:39.724399 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 29 10:08:39 crc kubenswrapper[4891]: I0929 10:08:39.724890 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 29 10:08:40 crc kubenswrapper[4891]: I0929 10:08:40.742968 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="17f1cdf8-c8a7-42b7-a864-e89db1b08cb7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 29 10:08:40 crc kubenswrapper[4891]: I0929 10:08:40.742992 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="17f1cdf8-c8a7-42b7-a864-e89db1b08cb7" containerName="nova-metadata-metadata" 
probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 29 10:08:41 crc kubenswrapper[4891]: I0929 10:08:41.495827 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 29 10:08:41 crc kubenswrapper[4891]: I0929 10:08:41.527327 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 29 10:08:41 crc kubenswrapper[4891]: I0929 10:08:41.532738 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 29 10:08:41 crc kubenswrapper[4891]: I0929 10:08:41.534371 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 29 10:08:42 crc kubenswrapper[4891]: I0929 10:08:42.306699 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 29 10:08:42 crc kubenswrapper[4891]: I0929 10:08:42.570230 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ae3bbc79-5ed8-4064-bc90-554ca707171b" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 29 10:08:42 crc kubenswrapper[4891]: I0929 10:08:42.570262 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ae3bbc79-5ed8-4064-bc90-554ca707171b" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 29 10:08:43 crc kubenswrapper[4891]: I0929 10:08:43.630276 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 29 10:08:47 crc kubenswrapper[4891]: I0929 10:08:47.201459 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/kube-state-metrics-0"] Sep 29 10:08:47 crc kubenswrapper[4891]: I0929 10:08:47.202236 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="d268201e-fc68-403d-958c-2b402143c96e" containerName="kube-state-metrics" containerID="cri-o://65060e9ccddfd75e6fc9883ecdabb867faded12a77979acb5788676ed42261a1" gracePeriod=30 Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:47.662704 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:47.799679 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg69z\" (UniqueName: \"kubernetes.io/projected/d268201e-fc68-403d-958c-2b402143c96e-kube-api-access-fg69z\") pod \"d268201e-fc68-403d-958c-2b402143c96e\" (UID: \"d268201e-fc68-403d-958c-2b402143c96e\") " Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:47.807240 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d268201e-fc68-403d-958c-2b402143c96e-kube-api-access-fg69z" (OuterVolumeSpecName: "kube-api-access-fg69z") pod "d268201e-fc68-403d-958c-2b402143c96e" (UID: "d268201e-fc68-403d-958c-2b402143c96e"). InnerVolumeSpecName "kube-api-access-fg69z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:47.902719 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg69z\" (UniqueName: \"kubernetes.io/projected/d268201e-fc68-403d-958c-2b402143c96e-kube-api-access-fg69z\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.350346 4891 generic.go:334] "Generic (PLEG): container finished" podID="d268201e-fc68-403d-958c-2b402143c96e" containerID="65060e9ccddfd75e6fc9883ecdabb867faded12a77979acb5788676ed42261a1" exitCode=2 Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.350952 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.351838 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d268201e-fc68-403d-958c-2b402143c96e","Type":"ContainerDied","Data":"65060e9ccddfd75e6fc9883ecdabb867faded12a77979acb5788676ed42261a1"} Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.351879 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d268201e-fc68-403d-958c-2b402143c96e","Type":"ContainerDied","Data":"6883d682bfb090e2f87bcfc7dc96bc5715d0aff173ffb529590ffadb24083203"} Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.351900 4891 scope.go:117] "RemoveContainer" containerID="65060e9ccddfd75e6fc9883ecdabb867faded12a77979acb5788676ed42261a1" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.383024 4891 scope.go:117] "RemoveContainer" containerID="65060e9ccddfd75e6fc9883ecdabb867faded12a77979acb5788676ed42261a1" Sep 29 10:08:48 crc kubenswrapper[4891]: E0929 10:08:48.384000 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"65060e9ccddfd75e6fc9883ecdabb867faded12a77979acb5788676ed42261a1\": container with ID starting with 65060e9ccddfd75e6fc9883ecdabb867faded12a77979acb5788676ed42261a1 not found: ID does not exist" containerID="65060e9ccddfd75e6fc9883ecdabb867faded12a77979acb5788676ed42261a1" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.384114 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65060e9ccddfd75e6fc9883ecdabb867faded12a77979acb5788676ed42261a1"} err="failed to get container status \"65060e9ccddfd75e6fc9883ecdabb867faded12a77979acb5788676ed42261a1\": rpc error: code = NotFound desc = could not find container \"65060e9ccddfd75e6fc9883ecdabb867faded12a77979acb5788676ed42261a1\": container with ID starting with 65060e9ccddfd75e6fc9883ecdabb867faded12a77979acb5788676ed42261a1 not found: ID does not exist" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.431560 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.431838 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.431878 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 10:08:48 crc kubenswrapper[4891]: E0929 10:08:48.432406 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d268201e-fc68-403d-958c-2b402143c96e" containerName="kube-state-metrics" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.432440 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="d268201e-fc68-403d-958c-2b402143c96e" containerName="kube-state-metrics" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.432845 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="d268201e-fc68-403d-958c-2b402143c96e" containerName="kube-state-metrics" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 
10:08:48.433979 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.436064 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.436544 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.439556 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.530443 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34c908de-54eb-4e12-a9a9-735fbf07c433-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"34c908de-54eb-4e12-a9a9-735fbf07c433\") " pod="openstack/kube-state-metrics-0" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.530506 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/34c908de-54eb-4e12-a9a9-735fbf07c433-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"34c908de-54eb-4e12-a9a9-735fbf07c433\") " pod="openstack/kube-state-metrics-0" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.530885 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/34c908de-54eb-4e12-a9a9-735fbf07c433-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"34c908de-54eb-4e12-a9a9-735fbf07c433\") " pod="openstack/kube-state-metrics-0" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.530996 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q54gc\" (UniqueName: \"kubernetes.io/projected/34c908de-54eb-4e12-a9a9-735fbf07c433-kube-api-access-q54gc\") pod \"kube-state-metrics-0\" (UID: \"34c908de-54eb-4e12-a9a9-735fbf07c433\") " pod="openstack/kube-state-metrics-0" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.631986 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/34c908de-54eb-4e12-a9a9-735fbf07c433-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"34c908de-54eb-4e12-a9a9-735fbf07c433\") " pod="openstack/kube-state-metrics-0" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.632042 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q54gc\" (UniqueName: \"kubernetes.io/projected/34c908de-54eb-4e12-a9a9-735fbf07c433-kube-api-access-q54gc\") pod \"kube-state-metrics-0\" (UID: \"34c908de-54eb-4e12-a9a9-735fbf07c433\") " pod="openstack/kube-state-metrics-0" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.632111 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34c908de-54eb-4e12-a9a9-735fbf07c433-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"34c908de-54eb-4e12-a9a9-735fbf07c433\") " pod="openstack/kube-state-metrics-0" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.632133 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/34c908de-54eb-4e12-a9a9-735fbf07c433-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"34c908de-54eb-4e12-a9a9-735fbf07c433\") " pod="openstack/kube-state-metrics-0" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.638505 4891 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/34c908de-54eb-4e12-a9a9-735fbf07c433-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"34c908de-54eb-4e12-a9a9-735fbf07c433\") " pod="openstack/kube-state-metrics-0" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.638612 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/34c908de-54eb-4e12-a9a9-735fbf07c433-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"34c908de-54eb-4e12-a9a9-735fbf07c433\") " pod="openstack/kube-state-metrics-0" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.639885 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34c908de-54eb-4e12-a9a9-735fbf07c433-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"34c908de-54eb-4e12-a9a9-735fbf07c433\") " pod="openstack/kube-state-metrics-0" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.652099 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q54gc\" (UniqueName: \"kubernetes.io/projected/34c908de-54eb-4e12-a9a9-735fbf07c433-kube-api-access-q54gc\") pod \"kube-state-metrics-0\" (UID: \"34c908de-54eb-4e12-a9a9-735fbf07c433\") " pod="openstack/kube-state-metrics-0" Sep 29 10:08:48 crc kubenswrapper[4891]: I0929 10:08:48.759232 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 29 10:08:49 crc kubenswrapper[4891]: I0929 10:08:49.175545 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:08:49 crc kubenswrapper[4891]: I0929 10:08:49.176380 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c908bd0e-e209-465b-b98e-f892a4a270f8" containerName="ceilometer-central-agent" containerID="cri-o://0e19292bf3d457000f279f22b4cbcec64844d45e05e59085a0a9c3ff82e11d3a" gracePeriod=30 Sep 29 10:08:49 crc kubenswrapper[4891]: I0929 10:08:49.176437 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c908bd0e-e209-465b-b98e-f892a4a270f8" containerName="proxy-httpd" containerID="cri-o://3f2eac9d3ad64740a51fc191cb7d99f24cb7831ca63f0af7e5457050807313d8" gracePeriod=30 Sep 29 10:08:49 crc kubenswrapper[4891]: I0929 10:08:49.176488 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c908bd0e-e209-465b-b98e-f892a4a270f8" containerName="ceilometer-notification-agent" containerID="cri-o://7d2a614dd8b9665552651072927ba040e646547ba95bfaec9319b47e006b8b62" gracePeriod=30 Sep 29 10:08:49 crc kubenswrapper[4891]: I0929 10:08:49.176480 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c908bd0e-e209-465b-b98e-f892a4a270f8" containerName="sg-core" containerID="cri-o://c79b6edd4cfda87ea421c32d285c647cbb1a7abc9ac9bbca0f709555407695b0" gracePeriod=30 Sep 29 10:08:49 crc kubenswrapper[4891]: W0929 10:08:49.240876 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34c908de_54eb_4e12_a9a9_735fbf07c433.slice/crio-ec0c3ec87a332b249657787b10c17813bffe6b3f4880ebb9cafcd87b222dedc1 WatchSource:0}: Error finding container 
ec0c3ec87a332b249657787b10c17813bffe6b3f4880ebb9cafcd87b222dedc1: Status 404 returned error can't find the container with id ec0c3ec87a332b249657787b10c17813bffe6b3f4880ebb9cafcd87b222dedc1 Sep 29 10:08:49 crc kubenswrapper[4891]: I0929 10:08:49.241445 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 10:08:49 crc kubenswrapper[4891]: I0929 10:08:49.243491 4891 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 10:08:49 crc kubenswrapper[4891]: I0929 10:08:49.363294 4891 generic.go:334] "Generic (PLEG): container finished" podID="c908bd0e-e209-465b-b98e-f892a4a270f8" containerID="3f2eac9d3ad64740a51fc191cb7d99f24cb7831ca63f0af7e5457050807313d8" exitCode=0 Sep 29 10:08:49 crc kubenswrapper[4891]: I0929 10:08:49.363332 4891 generic.go:334] "Generic (PLEG): container finished" podID="c908bd0e-e209-465b-b98e-f892a4a270f8" containerID="c79b6edd4cfda87ea421c32d285c647cbb1a7abc9ac9bbca0f709555407695b0" exitCode=2 Sep 29 10:08:49 crc kubenswrapper[4891]: I0929 10:08:49.363370 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c908bd0e-e209-465b-b98e-f892a4a270f8","Type":"ContainerDied","Data":"3f2eac9d3ad64740a51fc191cb7d99f24cb7831ca63f0af7e5457050807313d8"} Sep 29 10:08:49 crc kubenswrapper[4891]: I0929 10:08:49.363432 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c908bd0e-e209-465b-b98e-f892a4a270f8","Type":"ContainerDied","Data":"c79b6edd4cfda87ea421c32d285c647cbb1a7abc9ac9bbca0f709555407695b0"} Sep 29 10:08:49 crc kubenswrapper[4891]: I0929 10:08:49.371334 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"34c908de-54eb-4e12-a9a9-735fbf07c433","Type":"ContainerStarted","Data":"ec0c3ec87a332b249657787b10c17813bffe6b3f4880ebb9cafcd87b222dedc1"} Sep 29 10:08:49 crc kubenswrapper[4891]: I0929 10:08:49.730750 4891 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 29 10:08:49 crc kubenswrapper[4891]: I0929 10:08:49.735737 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 29 10:08:49 crc kubenswrapper[4891]: I0929 10:08:49.736460 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 29 10:08:50 crc kubenswrapper[4891]: I0929 10:08:50.387558 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"34c908de-54eb-4e12-a9a9-735fbf07c433","Type":"ContainerStarted","Data":"d254e5e4d78176ebb09e545fdb94d71cafe6a94de7279936528a5ff73564462f"} Sep 29 10:08:50 crc kubenswrapper[4891]: I0929 10:08:50.387650 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 29 10:08:50 crc kubenswrapper[4891]: I0929 10:08:50.390026 4891 generic.go:334] "Generic (PLEG): container finished" podID="c908bd0e-e209-465b-b98e-f892a4a270f8" containerID="0e19292bf3d457000f279f22b4cbcec64844d45e05e59085a0a9c3ff82e11d3a" exitCode=0 Sep 29 10:08:50 crc kubenswrapper[4891]: I0929 10:08:50.390354 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c908bd0e-e209-465b-b98e-f892a4a270f8","Type":"ContainerDied","Data":"0e19292bf3d457000f279f22b4cbcec64844d45e05e59085a0a9c3ff82e11d3a"} Sep 29 10:08:50 crc kubenswrapper[4891]: I0929 10:08:50.420430 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.97411322 podStartE2EDuration="2.420411676s" podCreationTimestamp="2025-09-29 10:08:48 +0000 UTC" firstStartedPulling="2025-09-29 10:08:49.243261575 +0000 UTC m=+1259.448429896" lastFinishedPulling="2025-09-29 10:08:49.689560031 +0000 UTC m=+1259.894728352" observedRunningTime="2025-09-29 10:08:50.411206684 +0000 UTC m=+1260.616375025" 
watchObservedRunningTime="2025-09-29 10:08:50.420411676 +0000 UTC m=+1260.625579997" Sep 29 10:08:50 crc kubenswrapper[4891]: I0929 10:08:50.421714 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d268201e-fc68-403d-958c-2b402143c96e" path="/var/lib/kubelet/pods/d268201e-fc68-403d-958c-2b402143c96e/volumes" Sep 29 10:08:50 crc kubenswrapper[4891]: I0929 10:08:50.424943 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 29 10:08:51 crc kubenswrapper[4891]: I0929 10:08:51.541668 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 29 10:08:51 crc kubenswrapper[4891]: I0929 10:08:51.542231 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 29 10:08:51 crc kubenswrapper[4891]: I0929 10:08:51.542277 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 29 10:08:51 crc kubenswrapper[4891]: I0929 10:08:51.548587 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 29 10:08:52 crc kubenswrapper[4891]: I0929 10:08:52.410318 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 29 10:08:52 crc kubenswrapper[4891]: I0929 10:08:52.416517 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 29 10:08:53 crc kubenswrapper[4891]: I0929 10:08:53.422646 4891 generic.go:334] "Generic (PLEG): container finished" podID="c908bd0e-e209-465b-b98e-f892a4a270f8" containerID="7d2a614dd8b9665552651072927ba040e646547ba95bfaec9319b47e006b8b62" exitCode=0 Sep 29 10:08:53 crc kubenswrapper[4891]: I0929 10:08:53.422738 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c908bd0e-e209-465b-b98e-f892a4a270f8","Type":"ContainerDied","Data":"7d2a614dd8b9665552651072927ba040e646547ba95bfaec9319b47e006b8b62"} Sep 29 10:08:53 crc kubenswrapper[4891]: I0929 10:08:53.981994 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.149774 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c908bd0e-e209-465b-b98e-f892a4a270f8-sg-core-conf-yaml\") pod \"c908bd0e-e209-465b-b98e-f892a4a270f8\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.149831 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c908bd0e-e209-465b-b98e-f892a4a270f8-config-data\") pod \"c908bd0e-e209-465b-b98e-f892a4a270f8\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.149882 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c908bd0e-e209-465b-b98e-f892a4a270f8-log-httpd\") pod \"c908bd0e-e209-465b-b98e-f892a4a270f8\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.149988 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c908bd0e-e209-465b-b98e-f892a4a270f8-scripts\") pod \"c908bd0e-e209-465b-b98e-f892a4a270f8\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.150081 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c908bd0e-e209-465b-b98e-f892a4a270f8-combined-ca-bundle\") pod 
\"c908bd0e-e209-465b-b98e-f892a4a270f8\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.150766 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c908bd0e-e209-465b-b98e-f892a4a270f8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c908bd0e-e209-465b-b98e-f892a4a270f8" (UID: "c908bd0e-e209-465b-b98e-f892a4a270f8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.150997 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4jsj\" (UniqueName: \"kubernetes.io/projected/c908bd0e-e209-465b-b98e-f892a4a270f8-kube-api-access-v4jsj\") pod \"c908bd0e-e209-465b-b98e-f892a4a270f8\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.151139 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c908bd0e-e209-465b-b98e-f892a4a270f8-run-httpd\") pod \"c908bd0e-e209-465b-b98e-f892a4a270f8\" (UID: \"c908bd0e-e209-465b-b98e-f892a4a270f8\") " Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.151503 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c908bd0e-e209-465b-b98e-f892a4a270f8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c908bd0e-e209-465b-b98e-f892a4a270f8" (UID: "c908bd0e-e209-465b-b98e-f892a4a270f8"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.152056 4891 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c908bd0e-e209-465b-b98e-f892a4a270f8-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.152133 4891 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c908bd0e-e209-465b-b98e-f892a4a270f8-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.157028 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c908bd0e-e209-465b-b98e-f892a4a270f8-scripts" (OuterVolumeSpecName: "scripts") pod "c908bd0e-e209-465b-b98e-f892a4a270f8" (UID: "c908bd0e-e209-465b-b98e-f892a4a270f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.157172 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c908bd0e-e209-465b-b98e-f892a4a270f8-kube-api-access-v4jsj" (OuterVolumeSpecName: "kube-api-access-v4jsj") pod "c908bd0e-e209-465b-b98e-f892a4a270f8" (UID: "c908bd0e-e209-465b-b98e-f892a4a270f8"). InnerVolumeSpecName "kube-api-access-v4jsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.182700 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c908bd0e-e209-465b-b98e-f892a4a270f8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c908bd0e-e209-465b-b98e-f892a4a270f8" (UID: "c908bd0e-e209-465b-b98e-f892a4a270f8"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.233893 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c908bd0e-e209-465b-b98e-f892a4a270f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c908bd0e-e209-465b-b98e-f892a4a270f8" (UID: "c908bd0e-e209-465b-b98e-f892a4a270f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.254845 4891 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c908bd0e-e209-465b-b98e-f892a4a270f8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.254890 4891 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c908bd0e-e209-465b-b98e-f892a4a270f8-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.254904 4891 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c908bd0e-e209-465b-b98e-f892a4a270f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.254918 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4jsj\" (UniqueName: \"kubernetes.io/projected/c908bd0e-e209-465b-b98e-f892a4a270f8-kube-api-access-v4jsj\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.257668 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c908bd0e-e209-465b-b98e-f892a4a270f8-config-data" (OuterVolumeSpecName: "config-data") pod "c908bd0e-e209-465b-b98e-f892a4a270f8" (UID: "c908bd0e-e209-465b-b98e-f892a4a270f8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.356526 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c908bd0e-e209-465b-b98e-f892a4a270f8-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.437308 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.438647 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c908bd0e-e209-465b-b98e-f892a4a270f8","Type":"ContainerDied","Data":"2e378fd9fce92a03c84200e0004270fa67af34706e747cf40cd8933f5d8de934"} Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.438725 4891 scope.go:117] "RemoveContainer" containerID="3f2eac9d3ad64740a51fc191cb7d99f24cb7831ca63f0af7e5457050807313d8" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.476017 4891 scope.go:117] "RemoveContainer" containerID="c79b6edd4cfda87ea421c32d285c647cbb1a7abc9ac9bbca0f709555407695b0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.495379 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.513349 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.522231 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:08:54 crc kubenswrapper[4891]: E0929 10:08:54.522910 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c908bd0e-e209-465b-b98e-f892a4a270f8" containerName="ceilometer-notification-agent" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.522936 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="c908bd0e-e209-465b-b98e-f892a4a270f8" 
containerName="ceilometer-notification-agent" Sep 29 10:08:54 crc kubenswrapper[4891]: E0929 10:08:54.522955 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c908bd0e-e209-465b-b98e-f892a4a270f8" containerName="sg-core" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.522964 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="c908bd0e-e209-465b-b98e-f892a4a270f8" containerName="sg-core" Sep 29 10:08:54 crc kubenswrapper[4891]: E0929 10:08:54.522996 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c908bd0e-e209-465b-b98e-f892a4a270f8" containerName="proxy-httpd" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.523005 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="c908bd0e-e209-465b-b98e-f892a4a270f8" containerName="proxy-httpd" Sep 29 10:08:54 crc kubenswrapper[4891]: E0929 10:08:54.523033 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c908bd0e-e209-465b-b98e-f892a4a270f8" containerName="ceilometer-central-agent" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.523040 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="c908bd0e-e209-465b-b98e-f892a4a270f8" containerName="ceilometer-central-agent" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.523400 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="c908bd0e-e209-465b-b98e-f892a4a270f8" containerName="sg-core" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.523436 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="c908bd0e-e209-465b-b98e-f892a4a270f8" containerName="proxy-httpd" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.523452 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="c908bd0e-e209-465b-b98e-f892a4a270f8" containerName="ceilometer-central-agent" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.523465 4891 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c908bd0e-e209-465b-b98e-f892a4a270f8" containerName="ceilometer-notification-agent" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.524464 4891 scope.go:117] "RemoveContainer" containerID="7d2a614dd8b9665552651072927ba040e646547ba95bfaec9319b47e006b8b62" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.531064 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.532947 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.534830 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.535048 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.535195 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.555717 4891 scope.go:117] "RemoveContainer" containerID="0e19292bf3d457000f279f22b4cbcec64844d45e05e59085a0a9c3ff82e11d3a" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.666932 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7f919b2-f640-44b6-83f9-f870057ba63a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e7f919b2-f640-44b6-83f9-f870057ba63a\") " pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.666990 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w5lr\" (UniqueName: \"kubernetes.io/projected/e7f919b2-f640-44b6-83f9-f870057ba63a-kube-api-access-7w5lr\") pod \"ceilometer-0\" (UID: 
\"e7f919b2-f640-44b6-83f9-f870057ba63a\") " pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.667057 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7f919b2-f640-44b6-83f9-f870057ba63a-scripts\") pod \"ceilometer-0\" (UID: \"e7f919b2-f640-44b6-83f9-f870057ba63a\") " pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.667092 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f919b2-f640-44b6-83f9-f870057ba63a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e7f919b2-f640-44b6-83f9-f870057ba63a\") " pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.667126 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7f919b2-f640-44b6-83f9-f870057ba63a-config-data\") pod \"ceilometer-0\" (UID: \"e7f919b2-f640-44b6-83f9-f870057ba63a\") " pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.667162 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7f919b2-f640-44b6-83f9-f870057ba63a-log-httpd\") pod \"ceilometer-0\" (UID: \"e7f919b2-f640-44b6-83f9-f870057ba63a\") " pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.667191 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7f919b2-f640-44b6-83f9-f870057ba63a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e7f919b2-f640-44b6-83f9-f870057ba63a\") " pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.667252 4891 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7f919b2-f640-44b6-83f9-f870057ba63a-run-httpd\") pod \"ceilometer-0\" (UID: \"e7f919b2-f640-44b6-83f9-f870057ba63a\") " pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.768640 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7f919b2-f640-44b6-83f9-f870057ba63a-run-httpd\") pod \"ceilometer-0\" (UID: \"e7f919b2-f640-44b6-83f9-f870057ba63a\") " pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.768982 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w5lr\" (UniqueName: \"kubernetes.io/projected/e7f919b2-f640-44b6-83f9-f870057ba63a-kube-api-access-7w5lr\") pod \"ceilometer-0\" (UID: \"e7f919b2-f640-44b6-83f9-f870057ba63a\") " pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.769004 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7f919b2-f640-44b6-83f9-f870057ba63a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e7f919b2-f640-44b6-83f9-f870057ba63a\") " pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.769045 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7f919b2-f640-44b6-83f9-f870057ba63a-scripts\") pod \"ceilometer-0\" (UID: \"e7f919b2-f640-44b6-83f9-f870057ba63a\") " pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.769067 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f919b2-f640-44b6-83f9-f870057ba63a-combined-ca-bundle\") pod \"ceilometer-0\" 
(UID: \"e7f919b2-f640-44b6-83f9-f870057ba63a\") " pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.769089 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7f919b2-f640-44b6-83f9-f870057ba63a-config-data\") pod \"ceilometer-0\" (UID: \"e7f919b2-f640-44b6-83f9-f870057ba63a\") " pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.769120 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7f919b2-f640-44b6-83f9-f870057ba63a-log-httpd\") pod \"ceilometer-0\" (UID: \"e7f919b2-f640-44b6-83f9-f870057ba63a\") " pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.769139 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7f919b2-f640-44b6-83f9-f870057ba63a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e7f919b2-f640-44b6-83f9-f870057ba63a\") " pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.769484 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7f919b2-f640-44b6-83f9-f870057ba63a-run-httpd\") pod \"ceilometer-0\" (UID: \"e7f919b2-f640-44b6-83f9-f870057ba63a\") " pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.769752 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7f919b2-f640-44b6-83f9-f870057ba63a-log-httpd\") pod \"ceilometer-0\" (UID: \"e7f919b2-f640-44b6-83f9-f870057ba63a\") " pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.774385 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e7f919b2-f640-44b6-83f9-f870057ba63a-config-data\") pod \"ceilometer-0\" (UID: \"e7f919b2-f640-44b6-83f9-f870057ba63a\") " pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.774673 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7f919b2-f640-44b6-83f9-f870057ba63a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e7f919b2-f640-44b6-83f9-f870057ba63a\") " pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.775422 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f919b2-f640-44b6-83f9-f870057ba63a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e7f919b2-f640-44b6-83f9-f870057ba63a\") " pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.775458 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7f919b2-f640-44b6-83f9-f870057ba63a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e7f919b2-f640-44b6-83f9-f870057ba63a\") " pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.788872 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7f919b2-f640-44b6-83f9-f870057ba63a-scripts\") pod \"ceilometer-0\" (UID: \"e7f919b2-f640-44b6-83f9-f870057ba63a\") " pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.791813 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w5lr\" (UniqueName: \"kubernetes.io/projected/e7f919b2-f640-44b6-83f9-f870057ba63a-kube-api-access-7w5lr\") pod \"ceilometer-0\" (UID: \"e7f919b2-f640-44b6-83f9-f870057ba63a\") " pod="openstack/ceilometer-0" Sep 29 10:08:54 crc kubenswrapper[4891]: I0929 10:08:54.861743 
4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:08:55 crc kubenswrapper[4891]: I0929 10:08:55.421323 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:08:55 crc kubenswrapper[4891]: I0929 10:08:55.448293 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7f919b2-f640-44b6-83f9-f870057ba63a","Type":"ContainerStarted","Data":"2e15c807c5dad8d006d3d2cbee752d93ef2c780e2766a9f6cf541c0a9d0b7af1"} Sep 29 10:08:56 crc kubenswrapper[4891]: I0929 10:08:56.409953 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c908bd0e-e209-465b-b98e-f892a4a270f8" path="/var/lib/kubelet/pods/c908bd0e-e209-465b-b98e-f892a4a270f8/volumes" Sep 29 10:08:57 crc kubenswrapper[4891]: I0929 10:08:57.489710 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7f919b2-f640-44b6-83f9-f870057ba63a","Type":"ContainerStarted","Data":"f1c111a60792ede11a2b48c016083bcf9f626370ec77b9931005480fa19b6a35"} Sep 29 10:08:58 crc kubenswrapper[4891]: I0929 10:08:58.521769 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7f919b2-f640-44b6-83f9-f870057ba63a","Type":"ContainerStarted","Data":"607c466f0cb14279c6d1ad89b85346ed453b824635b057710c9730ec6d4146df"} Sep 29 10:08:58 crc kubenswrapper[4891]: I0929 10:08:58.522774 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7f919b2-f640-44b6-83f9-f870057ba63a","Type":"ContainerStarted","Data":"2b6dfbd3d6a810b7ab54fa85c329b02e204c66467fd549ddd9f414a06d9c376c"} Sep 29 10:08:58 crc kubenswrapper[4891]: I0929 10:08:58.775699 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Sep 29 10:09:00 crc kubenswrapper[4891]: I0929 10:09:00.542440 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"e7f919b2-f640-44b6-83f9-f870057ba63a","Type":"ContainerStarted","Data":"1a3d3953630500b221196c9826069771882a518706c2031f671307fd6c397018"} Sep 29 10:09:00 crc kubenswrapper[4891]: I0929 10:09:00.544099 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 29 10:09:00 crc kubenswrapper[4891]: I0929 10:09:00.576687 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.055866362 podStartE2EDuration="6.576667868s" podCreationTimestamp="2025-09-29 10:08:54 +0000 UTC" firstStartedPulling="2025-09-29 10:08:55.420056365 +0000 UTC m=+1265.625224676" lastFinishedPulling="2025-09-29 10:08:59.940857861 +0000 UTC m=+1270.146026182" observedRunningTime="2025-09-29 10:09:00.572962072 +0000 UTC m=+1270.778130403" watchObservedRunningTime="2025-09-29 10:09:00.576667868 +0000 UTC m=+1270.781836189" Sep 29 10:09:24 crc kubenswrapper[4891]: I0929 10:09:24.874712 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 29 10:09:34 crc kubenswrapper[4891]: I0929 10:09:34.637511 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 10:09:35 crc kubenswrapper[4891]: I0929 10:09:35.630976 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 10:09:39 crc kubenswrapper[4891]: I0929 10:09:39.143085 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="f895d522-5026-4d72-862e-1a2b1bd5ee3c" containerName="rabbitmq" containerID="cri-o://65a17fedb0791dbd519736c2d594ac4926635a0b555f8613661cd2db2bb0554b" gracePeriod=604796 Sep 29 10:09:39 crc kubenswrapper[4891]: I0929 10:09:39.323018 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="f895d522-5026-4d72-862e-1a2b1bd5ee3c" 
containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Sep 29 10:09:39 crc kubenswrapper[4891]: I0929 10:09:39.546339 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="8fd6ea18-7472-42de-b949-140181cd55a5" containerName="rabbitmq" containerID="cri-o://118aac32b72509b12aeebfcc8ea409e79f304c4dcb0b284ed327e6f8bff96902" gracePeriod=604797 Sep 29 10:09:39 crc kubenswrapper[4891]: I0929 10:09:39.597285 4891 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="8fd6ea18-7472-42de-b949-140181cd55a5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Sep 29 10:09:44 crc kubenswrapper[4891]: E0929 10:09:44.891129 4891 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.151:54392->38.102.83.151:34867: read tcp 38.102.83.151:54392->38.102.83.151:34867: read: connection reset by peer Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.081190 4891 generic.go:334] "Generic (PLEG): container finished" podID="8fd6ea18-7472-42de-b949-140181cd55a5" containerID="118aac32b72509b12aeebfcc8ea409e79f304c4dcb0b284ed327e6f8bff96902" exitCode=0 Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.081263 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8fd6ea18-7472-42de-b949-140181cd55a5","Type":"ContainerDied","Data":"118aac32b72509b12aeebfcc8ea409e79f304c4dcb0b284ed327e6f8bff96902"} Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.085438 4891 generic.go:334] "Generic (PLEG): container finished" podID="f895d522-5026-4d72-862e-1a2b1bd5ee3c" containerID="65a17fedb0791dbd519736c2d594ac4926635a0b555f8613661cd2db2bb0554b" exitCode=0 Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.085522 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-server-0" event={"ID":"f895d522-5026-4d72-862e-1a2b1bd5ee3c","Type":"ContainerDied","Data":"65a17fedb0791dbd519736c2d594ac4926635a0b555f8613661cd2db2bb0554b"} Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.416652 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.424374 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.547846 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f895d522-5026-4d72-862e-1a2b1bd5ee3c-config-data\") pod \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.547911 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.547950 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-286m5\" (UniqueName: \"kubernetes.io/projected/8fd6ea18-7472-42de-b949-140181cd55a5-kube-api-access-286m5\") pod \"8fd6ea18-7472-42de-b949-140181cd55a5\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.547983 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f895d522-5026-4d72-862e-1a2b1bd5ee3c-rabbitmq-plugins\") pod \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 
10:09:46.548004 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"8fd6ea18-7472-42de-b949-140181cd55a5\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.548080 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f895d522-5026-4d72-862e-1a2b1bd5ee3c-server-conf\") pod \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.548115 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gp8w5\" (UniqueName: \"kubernetes.io/projected/f895d522-5026-4d72-862e-1a2b1bd5ee3c-kube-api-access-gp8w5\") pod \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.548137 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f895d522-5026-4d72-862e-1a2b1bd5ee3c-rabbitmq-confd\") pod \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.548164 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8fd6ea18-7472-42de-b949-140181cd55a5-rabbitmq-tls\") pod \"8fd6ea18-7472-42de-b949-140181cd55a5\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.548194 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f895d522-5026-4d72-862e-1a2b1bd5ee3c-erlang-cookie-secret\") pod 
\"f895d522-5026-4d72-862e-1a2b1bd5ee3c\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.548218 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8fd6ea18-7472-42de-b949-140181cd55a5-pod-info\") pod \"8fd6ea18-7472-42de-b949-140181cd55a5\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.548674 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f895d522-5026-4d72-862e-1a2b1bd5ee3c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f895d522-5026-4d72-862e-1a2b1bd5ee3c" (UID: "f895d522-5026-4d72-862e-1a2b1bd5ee3c"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.548760 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8fd6ea18-7472-42de-b949-140181cd55a5-server-conf\") pod \"8fd6ea18-7472-42de-b949-140181cd55a5\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.549093 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f895d522-5026-4d72-862e-1a2b1bd5ee3c-rabbitmq-tls\") pod \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.549125 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8fd6ea18-7472-42de-b949-140181cd55a5-plugins-conf\") pod \"8fd6ea18-7472-42de-b949-140181cd55a5\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.549167 4891 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8fd6ea18-7472-42de-b949-140181cd55a5-rabbitmq-erlang-cookie\") pod \"8fd6ea18-7472-42de-b949-140181cd55a5\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.549200 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8fd6ea18-7472-42de-b949-140181cd55a5-rabbitmq-confd\") pod \"8fd6ea18-7472-42de-b949-140181cd55a5\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.549247 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8fd6ea18-7472-42de-b949-140181cd55a5-config-data\") pod \"8fd6ea18-7472-42de-b949-140181cd55a5\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.549274 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f895d522-5026-4d72-862e-1a2b1bd5ee3c-plugins-conf\") pod \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.549295 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f895d522-5026-4d72-862e-1a2b1bd5ee3c-pod-info\") pod \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.549336 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f895d522-5026-4d72-862e-1a2b1bd5ee3c-rabbitmq-erlang-cookie\") pod 
\"f895d522-5026-4d72-862e-1a2b1bd5ee3c\" (UID: \"f895d522-5026-4d72-862e-1a2b1bd5ee3c\") " Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.549399 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8fd6ea18-7472-42de-b949-140181cd55a5-erlang-cookie-secret\") pod \"8fd6ea18-7472-42de-b949-140181cd55a5\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.549435 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8fd6ea18-7472-42de-b949-140181cd55a5-rabbitmq-plugins\") pod \"8fd6ea18-7472-42de-b949-140181cd55a5\" (UID: \"8fd6ea18-7472-42de-b949-140181cd55a5\") " Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.550267 4891 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f895d522-5026-4d72-862e-1a2b1bd5ee3c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.551092 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fd6ea18-7472-42de-b949-140181cd55a5-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "8fd6ea18-7472-42de-b949-140181cd55a5" (UID: "8fd6ea18-7472-42de-b949-140181cd55a5"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.551549 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fd6ea18-7472-42de-b949-140181cd55a5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "8fd6ea18-7472-42de-b949-140181cd55a5" (UID: "8fd6ea18-7472-42de-b949-140181cd55a5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.560533 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fd6ea18-7472-42de-b949-140181cd55a5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "8fd6ea18-7472-42de-b949-140181cd55a5" (UID: "8fd6ea18-7472-42de-b949-140181cd55a5"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.562442 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f895d522-5026-4d72-862e-1a2b1bd5ee3c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f895d522-5026-4d72-862e-1a2b1bd5ee3c" (UID: "f895d522-5026-4d72-862e-1a2b1bd5ee3c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.562780 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f895d522-5026-4d72-862e-1a2b1bd5ee3c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f895d522-5026-4d72-862e-1a2b1bd5ee3c" (UID: "f895d522-5026-4d72-862e-1a2b1bd5ee3c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.565058 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "8fd6ea18-7472-42de-b949-140181cd55a5" (UID: "8fd6ea18-7472-42de-b949-140181cd55a5"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.565104 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "f895d522-5026-4d72-862e-1a2b1bd5ee3c" (UID: "f895d522-5026-4d72-862e-1a2b1bd5ee3c"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.566987 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f895d522-5026-4d72-862e-1a2b1bd5ee3c-kube-api-access-gp8w5" (OuterVolumeSpecName: "kube-api-access-gp8w5") pod "f895d522-5026-4d72-862e-1a2b1bd5ee3c" (UID: "f895d522-5026-4d72-862e-1a2b1bd5ee3c"). InnerVolumeSpecName "kube-api-access-gp8w5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.567140 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f895d522-5026-4d72-862e-1a2b1bd5ee3c-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "f895d522-5026-4d72-862e-1a2b1bd5ee3c" (UID: "f895d522-5026-4d72-862e-1a2b1bd5ee3c"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.567339 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/8fd6ea18-7472-42de-b949-140181cd55a5-pod-info" (OuterVolumeSpecName: "pod-info") pod "8fd6ea18-7472-42de-b949-140181cd55a5" (UID: "8fd6ea18-7472-42de-b949-140181cd55a5"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.567643 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f895d522-5026-4d72-862e-1a2b1bd5ee3c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f895d522-5026-4d72-862e-1a2b1bd5ee3c" (UID: "f895d522-5026-4d72-862e-1a2b1bd5ee3c"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.568006 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f895d522-5026-4d72-862e-1a2b1bd5ee3c-pod-info" (OuterVolumeSpecName: "pod-info") pod "f895d522-5026-4d72-862e-1a2b1bd5ee3c" (UID: "f895d522-5026-4d72-862e-1a2b1bd5ee3c"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.568314 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fd6ea18-7472-42de-b949-140181cd55a5-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "8fd6ea18-7472-42de-b949-140181cd55a5" (UID: "8fd6ea18-7472-42de-b949-140181cd55a5"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.572961 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd6ea18-7472-42de-b949-140181cd55a5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "8fd6ea18-7472-42de-b949-140181cd55a5" (UID: "8fd6ea18-7472-42de-b949-140181cd55a5"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.585554 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fd6ea18-7472-42de-b949-140181cd55a5-kube-api-access-286m5" (OuterVolumeSpecName: "kube-api-access-286m5") pod "8fd6ea18-7472-42de-b949-140181cd55a5" (UID: "8fd6ea18-7472-42de-b949-140181cd55a5"). InnerVolumeSpecName "kube-api-access-286m5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.635623 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f895d522-5026-4d72-862e-1a2b1bd5ee3c-config-data" (OuterVolumeSpecName: "config-data") pod "f895d522-5026-4d72-862e-1a2b1bd5ee3c" (UID: "f895d522-5026-4d72-862e-1a2b1bd5ee3c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.648103 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fd6ea18-7472-42de-b949-140181cd55a5-config-data" (OuterVolumeSpecName: "config-data") pod "8fd6ea18-7472-42de-b949-140181cd55a5" (UID: "8fd6ea18-7472-42de-b949-140181cd55a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.652206 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gp8w5\" (UniqueName: \"kubernetes.io/projected/f895d522-5026-4d72-862e-1a2b1bd5ee3c-kube-api-access-gp8w5\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.652258 4891 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8fd6ea18-7472-42de-b949-140181cd55a5-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.652274 4891 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f895d522-5026-4d72-862e-1a2b1bd5ee3c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.652287 4891 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8fd6ea18-7472-42de-b949-140181cd55a5-pod-info\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.652297 4891 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f895d522-5026-4d72-862e-1a2b1bd5ee3c-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.652308 4891 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8fd6ea18-7472-42de-b949-140181cd55a5-plugins-conf\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.652319 4891 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8fd6ea18-7472-42de-b949-140181cd55a5-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 
10:09:46.652331 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8fd6ea18-7472-42de-b949-140181cd55a5-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.652341 4891 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f895d522-5026-4d72-862e-1a2b1bd5ee3c-plugins-conf\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.652353 4891 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f895d522-5026-4d72-862e-1a2b1bd5ee3c-pod-info\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.652364 4891 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f895d522-5026-4d72-862e-1a2b1bd5ee3c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.652375 4891 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8fd6ea18-7472-42de-b949-140181cd55a5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.652386 4891 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8fd6ea18-7472-42de-b949-140181cd55a5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.652397 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f895d522-5026-4d72-862e-1a2b1bd5ee3c-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.652435 4891 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.652449 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-286m5\" (UniqueName: \"kubernetes.io/projected/8fd6ea18-7472-42de-b949-140181cd55a5-kube-api-access-286m5\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.652469 4891 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.673674 4891 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.681310 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fd6ea18-7472-42de-b949-140181cd55a5-server-conf" (OuterVolumeSpecName: "server-conf") pod "8fd6ea18-7472-42de-b949-140181cd55a5" (UID: "8fd6ea18-7472-42de-b949-140181cd55a5"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.698706 4891 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.704805 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f895d522-5026-4d72-862e-1a2b1bd5ee3c-server-conf" (OuterVolumeSpecName: "server-conf") pod "f895d522-5026-4d72-862e-1a2b1bd5ee3c" (UID: "f895d522-5026-4d72-862e-1a2b1bd5ee3c"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.754469 4891 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8fd6ea18-7472-42de-b949-140181cd55a5-server-conf\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.754510 4891 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.754521 4891 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.754531 4891 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f895d522-5026-4d72-862e-1a2b1bd5ee3c-server-conf\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.772527 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fd6ea18-7472-42de-b949-140181cd55a5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "8fd6ea18-7472-42de-b949-140181cd55a5" (UID: "8fd6ea18-7472-42de-b949-140181cd55a5"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.823167 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f895d522-5026-4d72-862e-1a2b1bd5ee3c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f895d522-5026-4d72-862e-1a2b1bd5ee3c" (UID: "f895d522-5026-4d72-862e-1a2b1bd5ee3c"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.857071 4891 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f895d522-5026-4d72-862e-1a2b1bd5ee3c-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:46 crc kubenswrapper[4891]: I0929 10:09:46.857110 4891 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8fd6ea18-7472-42de-b949-140181cd55a5-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.109268 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8fd6ea18-7472-42de-b949-140181cd55a5","Type":"ContainerDied","Data":"fdcd1106271e9bcbcc843cd1a795129e21db691bfd022d93711685abd5346252"} Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.109339 4891 scope.go:117] "RemoveContainer" containerID="118aac32b72509b12aeebfcc8ea409e79f304c4dcb0b284ed327e6f8bff96902" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.109343 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.114408 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f895d522-5026-4d72-862e-1a2b1bd5ee3c","Type":"ContainerDied","Data":"c9920d452ea798d44cfe1486dec0f524944a0e058a65584caa3de2f3fe567fdb"} Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.114468 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.157075 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.161441 4891 scope.go:117] "RemoveContainer" containerID="a53fb837cafbe9f35e7bc1f41d1176489dd2cd6ae2323976c557b3e7bf630487" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.177021 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.195198 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.210478 4891 scope.go:117] "RemoveContainer" containerID="65a17fedb0791dbd519736c2d594ac4926635a0b555f8613661cd2db2bb0554b" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.210700 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.225577 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 10:09:47 crc kubenswrapper[4891]: E0929 10:09:47.226102 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd6ea18-7472-42de-b949-140181cd55a5" containerName="setup-container" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.226115 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd6ea18-7472-42de-b949-140181cd55a5" containerName="setup-container" Sep 29 10:09:47 crc kubenswrapper[4891]: E0929 10:09:47.226128 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f895d522-5026-4d72-862e-1a2b1bd5ee3c" containerName="setup-container" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.226135 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="f895d522-5026-4d72-862e-1a2b1bd5ee3c" containerName="setup-container" Sep 29 
10:09:47 crc kubenswrapper[4891]: E0929 10:09:47.226145 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f895d522-5026-4d72-862e-1a2b1bd5ee3c" containerName="rabbitmq" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.226151 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="f895d522-5026-4d72-862e-1a2b1bd5ee3c" containerName="rabbitmq" Sep 29 10:09:47 crc kubenswrapper[4891]: E0929 10:09:47.226177 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd6ea18-7472-42de-b949-140181cd55a5" containerName="rabbitmq" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.226183 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd6ea18-7472-42de-b949-140181cd55a5" containerName="rabbitmq" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.226355 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="f895d522-5026-4d72-862e-1a2b1bd5ee3c" containerName="rabbitmq" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.226363 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fd6ea18-7472-42de-b949-140181cd55a5" containerName="rabbitmq" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.233073 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.235641 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.235834 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-zdhqs" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.236855 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.237065 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.237140 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.237224 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.237486 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.245402 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.268048 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.269737 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.274430 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.274759 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.275066 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.275231 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-n25k7" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.275411 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.275483 4891 scope.go:117] "RemoveContainer" containerID="f35fd94d86fef28eca621ae9c5c625c1d28e5bd5c8a371383e8e27de6992ac12" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.275549 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.279034 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.281303 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.367167 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cdea1706-076e-4094-a87b-d79580a81fcd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc 
kubenswrapper[4891]: I0929 10:09:47.367250 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.367279 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.367307 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-config-data\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.367372 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.367393 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.367415 4891 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dflcs\" (UniqueName: \"kubernetes.io/projected/cdea1706-076e-4094-a87b-d79580a81fcd-kube-api-access-dflcs\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.367442 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cdea1706-076e-4094-a87b-d79580a81fcd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.367477 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.367521 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cdea1706-076e-4094-a87b-d79580a81fcd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.367552 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cdea1706-076e-4094-a87b-d79580a81fcd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.367598 4891 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cdea1706-076e-4094-a87b-d79580a81fcd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.367619 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cdea1706-076e-4094-a87b-d79580a81fcd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.367646 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.367674 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.367707 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.367734 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cdea1706-076e-4094-a87b-d79580a81fcd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.367763 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cdea1706-076e-4094-a87b-d79580a81fcd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.367803 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.367834 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.367859 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v68h\" (UniqueName: \"kubernetes.io/projected/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-kube-api-access-7v68h\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.367896 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cdea1706-076e-4094-a87b-d79580a81fcd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.469674 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cdea1706-076e-4094-a87b-d79580a81fcd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.469725 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cdea1706-076e-4094-a87b-d79580a81fcd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.469756 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.469799 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.469849 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.469877 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cdea1706-076e-4094-a87b-d79580a81fcd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.469908 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cdea1706-076e-4094-a87b-d79580a81fcd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.469943 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.469974 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.470000 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v68h\" (UniqueName: \"kubernetes.io/projected/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-kube-api-access-7v68h\") pod \"rabbitmq-server-0\" (UID: 
\"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.470043 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cdea1706-076e-4094-a87b-d79580a81fcd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.470092 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cdea1706-076e-4094-a87b-d79580a81fcd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.470124 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.470150 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.470176 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-config-data\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.470198 
4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.470217 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.470241 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dflcs\" (UniqueName: \"kubernetes.io/projected/cdea1706-076e-4094-a87b-d79580a81fcd-kube-api-access-dflcs\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.470262 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cdea1706-076e-4094-a87b-d79580a81fcd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.470293 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.470333 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/cdea1706-076e-4094-a87b-d79580a81fcd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.470362 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cdea1706-076e-4094-a87b-d79580a81fcd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.471233 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cdea1706-076e-4094-a87b-d79580a81fcd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.471660 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cdea1706-076e-4094-a87b-d79580a81fcd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.472415 4891 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.472738 4891 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: 
\"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.472922 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.473677 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cdea1706-076e-4094-a87b-d79580a81fcd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.474716 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.476389 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.477174 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cdea1706-076e-4094-a87b-d79580a81fcd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 
10:09:47.477400 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.478057 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cdea1706-076e-4094-a87b-d79580a81fcd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.480355 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-config-data\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.481447 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cdea1706-076e-4094-a87b-d79580a81fcd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.481506 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.482119 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/cdea1706-076e-4094-a87b-d79580a81fcd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.483361 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.483946 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cdea1706-076e-4094-a87b-d79580a81fcd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.485029 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cdea1706-076e-4094-a87b-d79580a81fcd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.485254 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.489190 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " 
pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.499702 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v68h\" (UniqueName: \"kubernetes.io/projected/e1fd90ef-a2cb-4931-ba68-9e302e943c2b-kube-api-access-7v68h\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.502281 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dflcs\" (UniqueName: \"kubernetes.io/projected/cdea1706-076e-4094-a87b-d79580a81fcd-kube-api-access-dflcs\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.516974 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"e1fd90ef-a2cb-4931-ba68-9e302e943c2b\") " pod="openstack/rabbitmq-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.519239 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cdea1706-076e-4094-a87b-d79580a81fcd\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.569337 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:09:47 crc kubenswrapper[4891]: I0929 10:09:47.591899 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 29 10:09:48 crc kubenswrapper[4891]: I0929 10:09:48.082195 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 10:09:48 crc kubenswrapper[4891]: W0929 10:09:48.100846 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdea1706_076e_4094_a87b_d79580a81fcd.slice/crio-c07186c260bb0c0742c94bbaee6f749fdcd34f2348b45ef234a541b4f53a6bf4 WatchSource:0}: Error finding container c07186c260bb0c0742c94bbaee6f749fdcd34f2348b45ef234a541b4f53a6bf4: Status 404 returned error can't find the container with id c07186c260bb0c0742c94bbaee6f749fdcd34f2348b45ef234a541b4f53a6bf4 Sep 29 10:09:48 crc kubenswrapper[4891]: I0929 10:09:48.102780 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 10:09:48 crc kubenswrapper[4891]: I0929 10:09:48.180878 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cdea1706-076e-4094-a87b-d79580a81fcd","Type":"ContainerStarted","Data":"c07186c260bb0c0742c94bbaee6f749fdcd34f2348b45ef234a541b4f53a6bf4"} Sep 29 10:09:48 crc kubenswrapper[4891]: I0929 10:09:48.186808 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e1fd90ef-a2cb-4931-ba68-9e302e943c2b","Type":"ContainerStarted","Data":"631b2dd2f0f0e5b5dac6914fbff71d806507563baaf350055ef8321e5bcd22e0"} Sep 29 10:09:48 crc kubenswrapper[4891]: I0929 10:09:48.408990 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fd6ea18-7472-42de-b949-140181cd55a5" path="/var/lib/kubelet/pods/8fd6ea18-7472-42de-b949-140181cd55a5/volumes" Sep 29 10:09:48 crc kubenswrapper[4891]: I0929 10:09:48.410255 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f895d522-5026-4d72-862e-1a2b1bd5ee3c" 
path="/var/lib/kubelet/pods/f895d522-5026-4d72-862e-1a2b1bd5ee3c/volumes" Sep 29 10:09:49 crc kubenswrapper[4891]: I0929 10:09:49.213939 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cdea1706-076e-4094-a87b-d79580a81fcd","Type":"ContainerStarted","Data":"d5a576949a5f4be07d7cc8ed632c4fccac079e4959920076c8abd02a2d7eb53c"} Sep 29 10:09:49 crc kubenswrapper[4891]: I0929 10:09:49.217098 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e1fd90ef-a2cb-4931-ba68-9e302e943c2b","Type":"ContainerStarted","Data":"6512a6f54b5a8087c9c789e5807c683d9540ebcffb3ef24f03b37f605388dd0e"} Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.461822 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-thjk2"] Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.464815 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.467515 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.472306 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-thjk2"] Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.569457 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-thjk2\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.569516 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-thjk2\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.569612 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-config\") pod \"dnsmasq-dns-5576978c7c-thjk2\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.569643 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-thjk2\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.569669 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpccf\" (UniqueName: \"kubernetes.io/projected/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-kube-api-access-mpccf\") pod \"dnsmasq-dns-5576978c7c-thjk2\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.569737 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-dns-svc\") pod \"dnsmasq-dns-5576978c7c-thjk2\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.569947 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-thjk2\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.671737 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-thjk2\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.671838 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-thjk2\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.671874 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-thjk2\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.671939 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-config\") pod \"dnsmasq-dns-5576978c7c-thjk2\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.671972 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-thjk2\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.671997 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpccf\" (UniqueName: \"kubernetes.io/projected/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-kube-api-access-mpccf\") pod \"dnsmasq-dns-5576978c7c-thjk2\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.672027 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-dns-svc\") pod \"dnsmasq-dns-5576978c7c-thjk2\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.673194 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-thjk2\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.673238 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-thjk2\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.673817 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-config\") pod 
\"dnsmasq-dns-5576978c7c-thjk2\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.674334 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-thjk2\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.675068 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-thjk2\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.675198 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-dns-svc\") pod \"dnsmasq-dns-5576978c7c-thjk2\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.693495 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpccf\" (UniqueName: \"kubernetes.io/projected/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-kube-api-access-mpccf\") pod \"dnsmasq-dns-5576978c7c-thjk2\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:09:51 crc kubenswrapper[4891]: I0929 10:09:51.786623 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:09:52 crc kubenswrapper[4891]: W0929 10:09:52.277550 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcce2ae2a_b69d_4dd6_82e8_63ea107c5817.slice/crio-02f1045ea1e5b2dcec544afa031051ac7d1d5dad9b8630b59f58b5f818fe490d WatchSource:0}: Error finding container 02f1045ea1e5b2dcec544afa031051ac7d1d5dad9b8630b59f58b5f818fe490d: Status 404 returned error can't find the container with id 02f1045ea1e5b2dcec544afa031051ac7d1d5dad9b8630b59f58b5f818fe490d Sep 29 10:09:52 crc kubenswrapper[4891]: I0929 10:09:52.286083 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-thjk2"] Sep 29 10:09:53 crc kubenswrapper[4891]: I0929 10:09:53.260378 4891 generic.go:334] "Generic (PLEG): container finished" podID="cce2ae2a-b69d-4dd6-82e8-63ea107c5817" containerID="66a2d8ff53a13f2c00d97b1464ab676484e50cc8695308d9b2e13b3956d77cf2" exitCode=0 Sep 29 10:09:53 crc kubenswrapper[4891]: I0929 10:09:53.260478 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-thjk2" event={"ID":"cce2ae2a-b69d-4dd6-82e8-63ea107c5817","Type":"ContainerDied","Data":"66a2d8ff53a13f2c00d97b1464ab676484e50cc8695308d9b2e13b3956d77cf2"} Sep 29 10:09:53 crc kubenswrapper[4891]: I0929 10:09:53.260528 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-thjk2" event={"ID":"cce2ae2a-b69d-4dd6-82e8-63ea107c5817","Type":"ContainerStarted","Data":"02f1045ea1e5b2dcec544afa031051ac7d1d5dad9b8630b59f58b5f818fe490d"} Sep 29 10:09:54 crc kubenswrapper[4891]: I0929 10:09:54.276664 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-thjk2" event={"ID":"cce2ae2a-b69d-4dd6-82e8-63ea107c5817","Type":"ContainerStarted","Data":"770c66d75368dc5ba3ba1d72af88381335d10c1124d547676c2501918b1fbf38"} Sep 29 10:09:54 crc 
kubenswrapper[4891]: I0929 10:09:54.277967 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:09:54 crc kubenswrapper[4891]: I0929 10:09:54.314660 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5576978c7c-thjk2" podStartSLOduration=3.314638799 podStartE2EDuration="3.314638799s" podCreationTimestamp="2025-09-29 10:09:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:09:54.313285099 +0000 UTC m=+1324.518453440" watchObservedRunningTime="2025-09-29 10:09:54.314638799 +0000 UTC m=+1324.519807120" Sep 29 10:10:01 crc kubenswrapper[4891]: I0929 10:10:01.788053 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:10:01 crc kubenswrapper[4891]: I0929 10:10:01.858162 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-2l6lc"] Sep 29 10:10:01 crc kubenswrapper[4891]: I0929 10:10:01.858416 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" podUID="013367be-7a48-44db-839f-145a78a17cc1" containerName="dnsmasq-dns" containerID="cri-o://8c176e2a9a45eadf839b665216c6b347cfab4bba431f2a16029c3d64ad1d0ece" gracePeriod=10 Sep 29 10:10:01 crc kubenswrapper[4891]: I0929 10:10:01.973821 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-4jnvs"] Sep 29 10:10:01 crc kubenswrapper[4891]: I0929 10:10:01.976176 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:01 crc kubenswrapper[4891]: I0929 10:10:01.992898 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-4jnvs"] Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.008048 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e33fd450-15eb-4135-bfb6-c42df60defa6-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-4jnvs\" (UID: \"e33fd450-15eb-4135-bfb6-c42df60defa6\") " pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.008096 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kks4q\" (UniqueName: \"kubernetes.io/projected/e33fd450-15eb-4135-bfb6-c42df60defa6-kube-api-access-kks4q\") pod \"dnsmasq-dns-8c6f6df99-4jnvs\" (UID: \"e33fd450-15eb-4135-bfb6-c42df60defa6\") " pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.008193 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e33fd450-15eb-4135-bfb6-c42df60defa6-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-4jnvs\" (UID: \"e33fd450-15eb-4135-bfb6-c42df60defa6\") " pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.008230 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e33fd450-15eb-4135-bfb6-c42df60defa6-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-4jnvs\" (UID: \"e33fd450-15eb-4135-bfb6-c42df60defa6\") " pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.008269 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e33fd450-15eb-4135-bfb6-c42df60defa6-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-4jnvs\" (UID: \"e33fd450-15eb-4135-bfb6-c42df60defa6\") " pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.008312 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e33fd450-15eb-4135-bfb6-c42df60defa6-config\") pod \"dnsmasq-dns-8c6f6df99-4jnvs\" (UID: \"e33fd450-15eb-4135-bfb6-c42df60defa6\") " pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.008364 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e33fd450-15eb-4135-bfb6-c42df60defa6-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-4jnvs\" (UID: \"e33fd450-15eb-4135-bfb6-c42df60defa6\") " pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.110566 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e33fd450-15eb-4135-bfb6-c42df60defa6-config\") pod \"dnsmasq-dns-8c6f6df99-4jnvs\" (UID: \"e33fd450-15eb-4135-bfb6-c42df60defa6\") " pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.110638 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e33fd450-15eb-4135-bfb6-c42df60defa6-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-4jnvs\" (UID: \"e33fd450-15eb-4135-bfb6-c42df60defa6\") " pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.110704 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e33fd450-15eb-4135-bfb6-c42df60defa6-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-4jnvs\" (UID: \"e33fd450-15eb-4135-bfb6-c42df60defa6\") " pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.110726 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kks4q\" (UniqueName: \"kubernetes.io/projected/e33fd450-15eb-4135-bfb6-c42df60defa6-kube-api-access-kks4q\") pod \"dnsmasq-dns-8c6f6df99-4jnvs\" (UID: \"e33fd450-15eb-4135-bfb6-c42df60defa6\") " pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.110862 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e33fd450-15eb-4135-bfb6-c42df60defa6-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-4jnvs\" (UID: \"e33fd450-15eb-4135-bfb6-c42df60defa6\") " pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.110923 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e33fd450-15eb-4135-bfb6-c42df60defa6-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-4jnvs\" (UID: \"e33fd450-15eb-4135-bfb6-c42df60defa6\") " pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.110978 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e33fd450-15eb-4135-bfb6-c42df60defa6-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-4jnvs\" (UID: \"e33fd450-15eb-4135-bfb6-c42df60defa6\") " pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.111520 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e33fd450-15eb-4135-bfb6-c42df60defa6-config\") pod \"dnsmasq-dns-8c6f6df99-4jnvs\" (UID: \"e33fd450-15eb-4135-bfb6-c42df60defa6\") " pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.111715 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e33fd450-15eb-4135-bfb6-c42df60defa6-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-4jnvs\" (UID: \"e33fd450-15eb-4135-bfb6-c42df60defa6\") " pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.112521 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e33fd450-15eb-4135-bfb6-c42df60defa6-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-4jnvs\" (UID: \"e33fd450-15eb-4135-bfb6-c42df60defa6\") " pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.112584 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e33fd450-15eb-4135-bfb6-c42df60defa6-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-4jnvs\" (UID: \"e33fd450-15eb-4135-bfb6-c42df60defa6\") " pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.113489 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e33fd450-15eb-4135-bfb6-c42df60defa6-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-4jnvs\" (UID: \"e33fd450-15eb-4135-bfb6-c42df60defa6\") " pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.113703 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e33fd450-15eb-4135-bfb6-c42df60defa6-ovsdbserver-nb\") pod 
\"dnsmasq-dns-8c6f6df99-4jnvs\" (UID: \"e33fd450-15eb-4135-bfb6-c42df60defa6\") " pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.131832 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kks4q\" (UniqueName: \"kubernetes.io/projected/e33fd450-15eb-4135-bfb6-c42df60defa6-kube-api-access-kks4q\") pod \"dnsmasq-dns-8c6f6df99-4jnvs\" (UID: \"e33fd450-15eb-4135-bfb6-c42df60defa6\") " pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.355870 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.375587 4891 generic.go:334] "Generic (PLEG): container finished" podID="013367be-7a48-44db-839f-145a78a17cc1" containerID="8c176e2a9a45eadf839b665216c6b347cfab4bba431f2a16029c3d64ad1d0ece" exitCode=0 Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.375640 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" event={"ID":"013367be-7a48-44db-839f-145a78a17cc1","Type":"ContainerDied","Data":"8c176e2a9a45eadf839b665216c6b347cfab4bba431f2a16029c3d64ad1d0ece"} Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.600114 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.624053 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-ovsdbserver-nb\") pod \"013367be-7a48-44db-839f-145a78a17cc1\" (UID: \"013367be-7a48-44db-839f-145a78a17cc1\") " Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.624208 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-dns-svc\") pod \"013367be-7a48-44db-839f-145a78a17cc1\" (UID: \"013367be-7a48-44db-839f-145a78a17cc1\") " Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.624246 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46fjj\" (UniqueName: \"kubernetes.io/projected/013367be-7a48-44db-839f-145a78a17cc1-kube-api-access-46fjj\") pod \"013367be-7a48-44db-839f-145a78a17cc1\" (UID: \"013367be-7a48-44db-839f-145a78a17cc1\") " Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.624481 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-config\") pod \"013367be-7a48-44db-839f-145a78a17cc1\" (UID: \"013367be-7a48-44db-839f-145a78a17cc1\") " Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.624531 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-dns-swift-storage-0\") pod \"013367be-7a48-44db-839f-145a78a17cc1\" (UID: \"013367be-7a48-44db-839f-145a78a17cc1\") " Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.624606 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-ovsdbserver-sb\") pod \"013367be-7a48-44db-839f-145a78a17cc1\" (UID: \"013367be-7a48-44db-839f-145a78a17cc1\") " Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.630930 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/013367be-7a48-44db-839f-145a78a17cc1-kube-api-access-46fjj" (OuterVolumeSpecName: "kube-api-access-46fjj") pod "013367be-7a48-44db-839f-145a78a17cc1" (UID: "013367be-7a48-44db-839f-145a78a17cc1"). InnerVolumeSpecName "kube-api-access-46fjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.694808 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "013367be-7a48-44db-839f-145a78a17cc1" (UID: "013367be-7a48-44db-839f-145a78a17cc1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.698847 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-config" (OuterVolumeSpecName: "config") pod "013367be-7a48-44db-839f-145a78a17cc1" (UID: "013367be-7a48-44db-839f-145a78a17cc1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.700747 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "013367be-7a48-44db-839f-145a78a17cc1" (UID: "013367be-7a48-44db-839f-145a78a17cc1"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.705232 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "013367be-7a48-44db-839f-145a78a17cc1" (UID: "013367be-7a48-44db-839f-145a78a17cc1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.717668 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "013367be-7a48-44db-839f-145a78a17cc1" (UID: "013367be-7a48-44db-839f-145a78a17cc1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.727850 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.727894 4891 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.727907 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.727915 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-ovsdbserver-nb\") on node \"crc\" DevicePath 
\"\"" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.727924 4891 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/013367be-7a48-44db-839f-145a78a17cc1-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.727933 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46fjj\" (UniqueName: \"kubernetes.io/projected/013367be-7a48-44db-839f-145a78a17cc1-kube-api-access-46fjj\") on node \"crc\" DevicePath \"\"" Sep 29 10:10:02 crc kubenswrapper[4891]: I0929 10:10:02.852309 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-4jnvs"] Sep 29 10:10:03 crc kubenswrapper[4891]: I0929 10:10:03.385356 4891 generic.go:334] "Generic (PLEG): container finished" podID="e33fd450-15eb-4135-bfb6-c42df60defa6" containerID="7d13f1a3ee4d99f6517c47f3e9ffb0b8fd135b613e8eaa5f96a519a2b9433b4a" exitCode=0 Sep 29 10:10:03 crc kubenswrapper[4891]: I0929 10:10:03.385532 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" event={"ID":"e33fd450-15eb-4135-bfb6-c42df60defa6","Type":"ContainerDied","Data":"7d13f1a3ee4d99f6517c47f3e9ffb0b8fd135b613e8eaa5f96a519a2b9433b4a"} Sep 29 10:10:03 crc kubenswrapper[4891]: I0929 10:10:03.385834 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" event={"ID":"e33fd450-15eb-4135-bfb6-c42df60defa6","Type":"ContainerStarted","Data":"13cd664c6af4f2cbee570ee0fcb2948de4f628a5c918660834d84e7a15586c0b"} Sep 29 10:10:03 crc kubenswrapper[4891]: I0929 10:10:03.388298 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" event={"ID":"013367be-7a48-44db-839f-145a78a17cc1","Type":"ContainerDied","Data":"884cf13e5a754618beead72f8d3481e8c981e68f6e9ffc4c533ff5031915af2b"} Sep 29 10:10:03 crc kubenswrapper[4891]: I0929 10:10:03.388346 4891 scope.go:117] "RemoveContainer" 
containerID="8c176e2a9a45eadf839b665216c6b347cfab4bba431f2a16029c3d64ad1d0ece" Sep 29 10:10:03 crc kubenswrapper[4891]: I0929 10:10:03.388531 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-2l6lc" Sep 29 10:10:03 crc kubenswrapper[4891]: I0929 10:10:03.559499 4891 scope.go:117] "RemoveContainer" containerID="bcbe6effabfe684fc3e2cd64c877af4b60f06ea8d75c6e31f19c8a393874e149" Sep 29 10:10:03 crc kubenswrapper[4891]: I0929 10:10:03.603597 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-2l6lc"] Sep 29 10:10:03 crc kubenswrapper[4891]: I0929 10:10:03.615132 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-2l6lc"] Sep 29 10:10:04 crc kubenswrapper[4891]: I0929 10:10:04.409513 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="013367be-7a48-44db-839f-145a78a17cc1" path="/var/lib/kubelet/pods/013367be-7a48-44db-839f-145a78a17cc1/volumes" Sep 29 10:10:04 crc kubenswrapper[4891]: I0929 10:10:04.410640 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:04 crc kubenswrapper[4891]: I0929 10:10:04.410680 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" event={"ID":"e33fd450-15eb-4135-bfb6-c42df60defa6","Type":"ContainerStarted","Data":"4db0723d840f3d2e87845e7a8ee73a500e5e4b75ff67623f0150a0c19ce5fd8d"} Sep 29 10:10:04 crc kubenswrapper[4891]: I0929 10:10:04.429401 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" podStartSLOduration=3.429367723 podStartE2EDuration="3.429367723s" podCreationTimestamp="2025-09-29 10:10:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:10:04.427893919 +0000 UTC m=+1334.633062270" 
watchObservedRunningTime="2025-09-29 10:10:04.429367723 +0000 UTC m=+1334.634536044" Sep 29 10:10:06 crc kubenswrapper[4891]: I0929 10:10:06.185882 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:10:06 crc kubenswrapper[4891]: I0929 10:10:06.186302 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:10:12 crc kubenswrapper[4891]: I0929 10:10:12.357994 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8c6f6df99-4jnvs" Sep 29 10:10:12 crc kubenswrapper[4891]: I0929 10:10:12.447977 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-thjk2"] Sep 29 10:10:12 crc kubenswrapper[4891]: I0929 10:10:12.448281 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5576978c7c-thjk2" podUID="cce2ae2a-b69d-4dd6-82e8-63ea107c5817" containerName="dnsmasq-dns" containerID="cri-o://770c66d75368dc5ba3ba1d72af88381335d10c1124d547676c2501918b1fbf38" gracePeriod=10 Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.113891 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.166013 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-dns-svc\") pod \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.166557 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-config\") pod \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.166728 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-dns-swift-storage-0\") pod \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.166919 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-ovsdbserver-nb\") pod \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.167114 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-ovsdbserver-sb\") pod \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.167219 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-openstack-edpm-ipam\") pod \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.167329 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpccf\" (UniqueName: \"kubernetes.io/projected/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-kube-api-access-mpccf\") pod \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\" (UID: \"cce2ae2a-b69d-4dd6-82e8-63ea107c5817\") " Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.185349 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-kube-api-access-mpccf" (OuterVolumeSpecName: "kube-api-access-mpccf") pod "cce2ae2a-b69d-4dd6-82e8-63ea107c5817" (UID: "cce2ae2a-b69d-4dd6-82e8-63ea107c5817"). InnerVolumeSpecName "kube-api-access-mpccf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.236199 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cce2ae2a-b69d-4dd6-82e8-63ea107c5817" (UID: "cce2ae2a-b69d-4dd6-82e8-63ea107c5817"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.236871 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-config" (OuterVolumeSpecName: "config") pod "cce2ae2a-b69d-4dd6-82e8-63ea107c5817" (UID: "cce2ae2a-b69d-4dd6-82e8-63ea107c5817"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.243345 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cce2ae2a-b69d-4dd6-82e8-63ea107c5817" (UID: "cce2ae2a-b69d-4dd6-82e8-63ea107c5817"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.252661 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "cce2ae2a-b69d-4dd6-82e8-63ea107c5817" (UID: "cce2ae2a-b69d-4dd6-82e8-63ea107c5817"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.254497 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cce2ae2a-b69d-4dd6-82e8-63ea107c5817" (UID: "cce2ae2a-b69d-4dd6-82e8-63ea107c5817"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.257348 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cce2ae2a-b69d-4dd6-82e8-63ea107c5817" (UID: "cce2ae2a-b69d-4dd6-82e8-63ea107c5817"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.273067 4891 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.273114 4891 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.273129 4891 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.273145 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.273156 4891 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.273166 4891 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.273178 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpccf\" (UniqueName: \"kubernetes.io/projected/cce2ae2a-b69d-4dd6-82e8-63ea107c5817-kube-api-access-mpccf\") on node \"crc\" DevicePath \"\"" Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.506072 
4891 generic.go:334] "Generic (PLEG): container finished" podID="cce2ae2a-b69d-4dd6-82e8-63ea107c5817" containerID="770c66d75368dc5ba3ba1d72af88381335d10c1124d547676c2501918b1fbf38" exitCode=0 Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.506275 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-thjk2" event={"ID":"cce2ae2a-b69d-4dd6-82e8-63ea107c5817","Type":"ContainerDied","Data":"770c66d75368dc5ba3ba1d72af88381335d10c1124d547676c2501918b1fbf38"} Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.506776 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-thjk2" event={"ID":"cce2ae2a-b69d-4dd6-82e8-63ea107c5817","Type":"ContainerDied","Data":"02f1045ea1e5b2dcec544afa031051ac7d1d5dad9b8630b59f58b5f818fe490d"} Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.506827 4891 scope.go:117] "RemoveContainer" containerID="770c66d75368dc5ba3ba1d72af88381335d10c1124d547676c2501918b1fbf38" Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.506419 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-thjk2" Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.532323 4891 scope.go:117] "RemoveContainer" containerID="66a2d8ff53a13f2c00d97b1464ab676484e50cc8695308d9b2e13b3956d77cf2" Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.549562 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-thjk2"] Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.560328 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-thjk2"] Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.567249 4891 scope.go:117] "RemoveContainer" containerID="770c66d75368dc5ba3ba1d72af88381335d10c1124d547676c2501918b1fbf38" Sep 29 10:10:13 crc kubenswrapper[4891]: E0929 10:10:13.567873 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"770c66d75368dc5ba3ba1d72af88381335d10c1124d547676c2501918b1fbf38\": container with ID starting with 770c66d75368dc5ba3ba1d72af88381335d10c1124d547676c2501918b1fbf38 not found: ID does not exist" containerID="770c66d75368dc5ba3ba1d72af88381335d10c1124d547676c2501918b1fbf38" Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.567917 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"770c66d75368dc5ba3ba1d72af88381335d10c1124d547676c2501918b1fbf38"} err="failed to get container status \"770c66d75368dc5ba3ba1d72af88381335d10c1124d547676c2501918b1fbf38\": rpc error: code = NotFound desc = could not find container \"770c66d75368dc5ba3ba1d72af88381335d10c1124d547676c2501918b1fbf38\": container with ID starting with 770c66d75368dc5ba3ba1d72af88381335d10c1124d547676c2501918b1fbf38 not found: ID does not exist" Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.567948 4891 scope.go:117] "RemoveContainer" containerID="66a2d8ff53a13f2c00d97b1464ab676484e50cc8695308d9b2e13b3956d77cf2" Sep 29 
10:10:13 crc kubenswrapper[4891]: E0929 10:10:13.568386 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66a2d8ff53a13f2c00d97b1464ab676484e50cc8695308d9b2e13b3956d77cf2\": container with ID starting with 66a2d8ff53a13f2c00d97b1464ab676484e50cc8695308d9b2e13b3956d77cf2 not found: ID does not exist" containerID="66a2d8ff53a13f2c00d97b1464ab676484e50cc8695308d9b2e13b3956d77cf2" Sep 29 10:10:13 crc kubenswrapper[4891]: I0929 10:10:13.568454 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a2d8ff53a13f2c00d97b1464ab676484e50cc8695308d9b2e13b3956d77cf2"} err="failed to get container status \"66a2d8ff53a13f2c00d97b1464ab676484e50cc8695308d9b2e13b3956d77cf2\": rpc error: code = NotFound desc = could not find container \"66a2d8ff53a13f2c00d97b1464ab676484e50cc8695308d9b2e13b3956d77cf2\": container with ID starting with 66a2d8ff53a13f2c00d97b1464ab676484e50cc8695308d9b2e13b3956d77cf2 not found: ID does not exist" Sep 29 10:10:14 crc kubenswrapper[4891]: I0929 10:10:14.409107 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cce2ae2a-b69d-4dd6-82e8-63ea107c5817" path="/var/lib/kubelet/pods/cce2ae2a-b69d-4dd6-82e8-63ea107c5817/volumes" Sep 29 10:10:19 crc kubenswrapper[4891]: I0929 10:10:19.567406 4891 generic.go:334] "Generic (PLEG): container finished" podID="cdea1706-076e-4094-a87b-d79580a81fcd" containerID="d5a576949a5f4be07d7cc8ed632c4fccac079e4959920076c8abd02a2d7eb53c" exitCode=0 Sep 29 10:10:19 crc kubenswrapper[4891]: I0929 10:10:19.567521 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cdea1706-076e-4094-a87b-d79580a81fcd","Type":"ContainerDied","Data":"d5a576949a5f4be07d7cc8ed632c4fccac079e4959920076c8abd02a2d7eb53c"} Sep 29 10:10:19 crc kubenswrapper[4891]: I0929 10:10:19.569826 4891 generic.go:334] "Generic (PLEG): container finished" 
podID="e1fd90ef-a2cb-4931-ba68-9e302e943c2b" containerID="6512a6f54b5a8087c9c789e5807c683d9540ebcffb3ef24f03b37f605388dd0e" exitCode=0 Sep 29 10:10:19 crc kubenswrapper[4891]: I0929 10:10:19.569877 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e1fd90ef-a2cb-4931-ba68-9e302e943c2b","Type":"ContainerDied","Data":"6512a6f54b5a8087c9c789e5807c683d9540ebcffb3ef24f03b37f605388dd0e"} Sep 29 10:10:20 crc kubenswrapper[4891]: I0929 10:10:20.579833 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cdea1706-076e-4094-a87b-d79580a81fcd","Type":"ContainerStarted","Data":"10595bc4afb2d86245b287eec776429799994607f8ecae0d1470b0f53ed8eb22"} Sep 29 10:10:20 crc kubenswrapper[4891]: I0929 10:10:20.581461 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:10:20 crc kubenswrapper[4891]: I0929 10:10:20.584043 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e1fd90ef-a2cb-4931-ba68-9e302e943c2b","Type":"ContainerStarted","Data":"07146050f527df18c82376fe6695eef3bbc091d6e38f43b7cdb44513d5579a84"} Sep 29 10:10:20 crc kubenswrapper[4891]: I0929 10:10:20.584720 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 29 10:10:20 crc kubenswrapper[4891]: I0929 10:10:20.607128 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=33.607103994 podStartE2EDuration="33.607103994s" podCreationTimestamp="2025-09-29 10:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:10:20.604116417 +0000 UTC m=+1350.809284758" watchObservedRunningTime="2025-09-29 10:10:20.607103994 +0000 UTC m=+1350.812272315" Sep 29 10:10:20 crc kubenswrapper[4891]: 
I0929 10:10:20.647597 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=33.647573822 podStartE2EDuration="33.647573822s" podCreationTimestamp="2025-09-29 10:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:10:20.640518625 +0000 UTC m=+1350.845686966" watchObservedRunningTime="2025-09-29 10:10:20.647573822 +0000 UTC m=+1350.852742143" Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.334097 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d"] Sep 29 10:10:29 crc kubenswrapper[4891]: E0929 10:10:29.336334 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="013367be-7a48-44db-839f-145a78a17cc1" containerName="dnsmasq-dns" Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.336436 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="013367be-7a48-44db-839f-145a78a17cc1" containerName="dnsmasq-dns" Sep 29 10:10:29 crc kubenswrapper[4891]: E0929 10:10:29.336543 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="013367be-7a48-44db-839f-145a78a17cc1" containerName="init" Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.336611 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="013367be-7a48-44db-839f-145a78a17cc1" containerName="init" Sep 29 10:10:29 crc kubenswrapper[4891]: E0929 10:10:29.336686 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cce2ae2a-b69d-4dd6-82e8-63ea107c5817" containerName="dnsmasq-dns" Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.336748 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce2ae2a-b69d-4dd6-82e8-63ea107c5817" containerName="dnsmasq-dns" Sep 29 10:10:29 crc kubenswrapper[4891]: E0929 10:10:29.336853 4891 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cce2ae2a-b69d-4dd6-82e8-63ea107c5817" containerName="init" Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.336931 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce2ae2a-b69d-4dd6-82e8-63ea107c5817" containerName="init" Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.337250 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="cce2ae2a-b69d-4dd6-82e8-63ea107c5817" containerName="dnsmasq-dns" Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.337353 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="013367be-7a48-44db-839f-145a78a17cc1" containerName="dnsmasq-dns" Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.338264 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d" Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.341231 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.341284 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.345148 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.345483 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b9rgd" Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.351214 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d"] Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.450895 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wr6r\" (UniqueName: 
\"kubernetes.io/projected/6398fd57-3f6f-4c01-98da-81f0ad16c4a6-kube-api-access-6wr6r\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d\" (UID: \"6398fd57-3f6f-4c01-98da-81f0ad16c4a6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d" Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.451118 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6398fd57-3f6f-4c01-98da-81f0ad16c4a6-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d\" (UID: \"6398fd57-3f6f-4c01-98da-81f0ad16c4a6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d" Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.451294 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6398fd57-3f6f-4c01-98da-81f0ad16c4a6-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d\" (UID: \"6398fd57-3f6f-4c01-98da-81f0ad16c4a6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d" Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.451359 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6398fd57-3f6f-4c01-98da-81f0ad16c4a6-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d\" (UID: \"6398fd57-3f6f-4c01-98da-81f0ad16c4a6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d" Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.553690 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wr6r\" (UniqueName: \"kubernetes.io/projected/6398fd57-3f6f-4c01-98da-81f0ad16c4a6-kube-api-access-6wr6r\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d\" (UID: 
\"6398fd57-3f6f-4c01-98da-81f0ad16c4a6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d" Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.554396 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6398fd57-3f6f-4c01-98da-81f0ad16c4a6-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d\" (UID: \"6398fd57-3f6f-4c01-98da-81f0ad16c4a6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d" Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.554613 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6398fd57-3f6f-4c01-98da-81f0ad16c4a6-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d\" (UID: \"6398fd57-3f6f-4c01-98da-81f0ad16c4a6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d" Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.554748 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6398fd57-3f6f-4c01-98da-81f0ad16c4a6-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d\" (UID: \"6398fd57-3f6f-4c01-98da-81f0ad16c4a6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d" Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.561724 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6398fd57-3f6f-4c01-98da-81f0ad16c4a6-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d\" (UID: \"6398fd57-3f6f-4c01-98da-81f0ad16c4a6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d" Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.562006 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6398fd57-3f6f-4c01-98da-81f0ad16c4a6-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d\" (UID: \"6398fd57-3f6f-4c01-98da-81f0ad16c4a6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d"
Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.562559 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6398fd57-3f6f-4c01-98da-81f0ad16c4a6-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d\" (UID: \"6398fd57-3f6f-4c01-98da-81f0ad16c4a6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d"
Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.575181 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wr6r\" (UniqueName: \"kubernetes.io/projected/6398fd57-3f6f-4c01-98da-81f0ad16c4a6-kube-api-access-6wr6r\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d\" (UID: \"6398fd57-3f6f-4c01-98da-81f0ad16c4a6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d"
Sep 29 10:10:29 crc kubenswrapper[4891]: I0929 10:10:29.708781 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d"
Sep 29 10:10:30 crc kubenswrapper[4891]: I0929 10:10:30.315923 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d"]
Sep 29 10:10:30 crc kubenswrapper[4891]: I0929 10:10:30.699686 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d" event={"ID":"6398fd57-3f6f-4c01-98da-81f0ad16c4a6","Type":"ContainerStarted","Data":"b610691b4f5a6bc6659760dfc6c89b1d1877584d2168f4294d9d814ffaa213f6"}
Sep 29 10:10:36 crc kubenswrapper[4891]: I0929 10:10:36.185778 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 29 10:10:36 crc kubenswrapper[4891]: I0929 10:10:36.186220 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 10:10:37 crc kubenswrapper[4891]: I0929 10:10:37.573063 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Sep 29 10:10:37 crc kubenswrapper[4891]: I0929 10:10:37.599014 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Sep 29 10:10:39 crc kubenswrapper[4891]: I0929 10:10:39.804981 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d" event={"ID":"6398fd57-3f6f-4c01-98da-81f0ad16c4a6","Type":"ContainerStarted","Data":"5cd8158fadf7b7a500426775d552ed17a38d90fef3bec0c66665290cc48820ac"}
Sep 29 10:10:39 crc kubenswrapper[4891]: I0929 10:10:39.839573 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d" podStartSLOduration=1.895910486 podStartE2EDuration="10.839541774s" podCreationTimestamp="2025-09-29 10:10:29 +0000 UTC" firstStartedPulling="2025-09-29 10:10:30.316326568 +0000 UTC m=+1360.521494889" lastFinishedPulling="2025-09-29 10:10:39.259957836 +0000 UTC m=+1369.465126177" observedRunningTime="2025-09-29 10:10:39.824553974 +0000 UTC m=+1370.029722295" watchObservedRunningTime="2025-09-29 10:10:39.839541774 +0000 UTC m=+1370.044710135"
Sep 29 10:10:51 crc kubenswrapper[4891]: I0929 10:10:51.925342 4891 generic.go:334] "Generic (PLEG): container finished" podID="6398fd57-3f6f-4c01-98da-81f0ad16c4a6" containerID="5cd8158fadf7b7a500426775d552ed17a38d90fef3bec0c66665290cc48820ac" exitCode=0
Sep 29 10:10:51 crc kubenswrapper[4891]: I0929 10:10:51.925452 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d" event={"ID":"6398fd57-3f6f-4c01-98da-81f0ad16c4a6","Type":"ContainerDied","Data":"5cd8158fadf7b7a500426775d552ed17a38d90fef3bec0c66665290cc48820ac"}
Sep 29 10:10:53 crc kubenswrapper[4891]: I0929 10:10:53.374258 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d"
Sep 29 10:10:53 crc kubenswrapper[4891]: I0929 10:10:53.386185 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wr6r\" (UniqueName: \"kubernetes.io/projected/6398fd57-3f6f-4c01-98da-81f0ad16c4a6-kube-api-access-6wr6r\") pod \"6398fd57-3f6f-4c01-98da-81f0ad16c4a6\" (UID: \"6398fd57-3f6f-4c01-98da-81f0ad16c4a6\") "
Sep 29 10:10:53 crc kubenswrapper[4891]: I0929 10:10:53.386358 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6398fd57-3f6f-4c01-98da-81f0ad16c4a6-inventory\") pod \"6398fd57-3f6f-4c01-98da-81f0ad16c4a6\" (UID: \"6398fd57-3f6f-4c01-98da-81f0ad16c4a6\") "
Sep 29 10:10:53 crc kubenswrapper[4891]: I0929 10:10:53.386497 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6398fd57-3f6f-4c01-98da-81f0ad16c4a6-repo-setup-combined-ca-bundle\") pod \"6398fd57-3f6f-4c01-98da-81f0ad16c4a6\" (UID: \"6398fd57-3f6f-4c01-98da-81f0ad16c4a6\") "
Sep 29 10:10:53 crc kubenswrapper[4891]: I0929 10:10:53.386545 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6398fd57-3f6f-4c01-98da-81f0ad16c4a6-ssh-key\") pod \"6398fd57-3f6f-4c01-98da-81f0ad16c4a6\" (UID: \"6398fd57-3f6f-4c01-98da-81f0ad16c4a6\") "
Sep 29 10:10:53 crc kubenswrapper[4891]: I0929 10:10:53.395255 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6398fd57-3f6f-4c01-98da-81f0ad16c4a6-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "6398fd57-3f6f-4c01-98da-81f0ad16c4a6" (UID: "6398fd57-3f6f-4c01-98da-81f0ad16c4a6"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:10:53 crc kubenswrapper[4891]: I0929 10:10:53.400324 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6398fd57-3f6f-4c01-98da-81f0ad16c4a6-kube-api-access-6wr6r" (OuterVolumeSpecName: "kube-api-access-6wr6r") pod "6398fd57-3f6f-4c01-98da-81f0ad16c4a6" (UID: "6398fd57-3f6f-4c01-98da-81f0ad16c4a6"). InnerVolumeSpecName "kube-api-access-6wr6r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:10:53 crc kubenswrapper[4891]: I0929 10:10:53.456041 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6398fd57-3f6f-4c01-98da-81f0ad16c4a6-inventory" (OuterVolumeSpecName: "inventory") pod "6398fd57-3f6f-4c01-98da-81f0ad16c4a6" (UID: "6398fd57-3f6f-4c01-98da-81f0ad16c4a6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:10:53 crc kubenswrapper[4891]: I0929 10:10:53.464094 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6398fd57-3f6f-4c01-98da-81f0ad16c4a6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6398fd57-3f6f-4c01-98da-81f0ad16c4a6" (UID: "6398fd57-3f6f-4c01-98da-81f0ad16c4a6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:10:53 crc kubenswrapper[4891]: I0929 10:10:53.489511 4891 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6398fd57-3f6f-4c01-98da-81f0ad16c4a6-inventory\") on node \"crc\" DevicePath \"\""
Sep 29 10:10:53 crc kubenswrapper[4891]: I0929 10:10:53.489552 4891 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6398fd57-3f6f-4c01-98da-81f0ad16c4a6-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 29 10:10:53 crc kubenswrapper[4891]: I0929 10:10:53.489563 4891 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6398fd57-3f6f-4c01-98da-81f0ad16c4a6-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 29 10:10:53 crc kubenswrapper[4891]: I0929 10:10:53.489572 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wr6r\" (UniqueName: \"kubernetes.io/projected/6398fd57-3f6f-4c01-98da-81f0ad16c4a6-kube-api-access-6wr6r\") on node \"crc\" DevicePath \"\""
Sep 29 10:10:53 crc kubenswrapper[4891]: I0929 10:10:53.950217 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d" event={"ID":"6398fd57-3f6f-4c01-98da-81f0ad16c4a6","Type":"ContainerDied","Data":"b610691b4f5a6bc6659760dfc6c89b1d1877584d2168f4294d9d814ffaa213f6"}
Sep 29 10:10:53 crc kubenswrapper[4891]: I0929 10:10:53.950273 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b610691b4f5a6bc6659760dfc6c89b1d1877584d2168f4294d9d814ffaa213f6"
Sep 29 10:10:53 crc kubenswrapper[4891]: I0929 10:10:53.950323 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d"
Sep 29 10:10:54 crc kubenswrapper[4891]: I0929 10:10:54.023785 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t8ns"]
Sep 29 10:10:54 crc kubenswrapper[4891]: E0929 10:10:54.024375 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6398fd57-3f6f-4c01-98da-81f0ad16c4a6" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Sep 29 10:10:54 crc kubenswrapper[4891]: I0929 10:10:54.024399 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="6398fd57-3f6f-4c01-98da-81f0ad16c4a6" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Sep 29 10:10:54 crc kubenswrapper[4891]: I0929 10:10:54.024677 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="6398fd57-3f6f-4c01-98da-81f0ad16c4a6" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Sep 29 10:10:54 crc kubenswrapper[4891]: I0929 10:10:54.025570 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t8ns"
Sep 29 10:10:54 crc kubenswrapper[4891]: I0929 10:10:54.029166 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b9rgd"
Sep 29 10:10:54 crc kubenswrapper[4891]: I0929 10:10:54.029384 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 29 10:10:54 crc kubenswrapper[4891]: I0929 10:10:54.029643 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 29 10:10:54 crc kubenswrapper[4891]: I0929 10:10:54.030143 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 29 10:10:54 crc kubenswrapper[4891]: I0929 10:10:54.053241 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t8ns"]
Sep 29 10:10:54 crc kubenswrapper[4891]: I0929 10:10:54.205217 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d51c52c6-2e99-465a-9654-c58b12dd213e-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6t8ns\" (UID: \"d51c52c6-2e99-465a-9654-c58b12dd213e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t8ns"
Sep 29 10:10:54 crc kubenswrapper[4891]: I0929 10:10:54.205458 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d51c52c6-2e99-465a-9654-c58b12dd213e-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6t8ns\" (UID: \"d51c52c6-2e99-465a-9654-c58b12dd213e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t8ns"
Sep 29 10:10:54 crc kubenswrapper[4891]: I0929 10:10:54.205652 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7z54\" (UniqueName: \"kubernetes.io/projected/d51c52c6-2e99-465a-9654-c58b12dd213e-kube-api-access-r7z54\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6t8ns\" (UID: \"d51c52c6-2e99-465a-9654-c58b12dd213e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t8ns"
Sep 29 10:10:54 crc kubenswrapper[4891]: I0929 10:10:54.307942 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d51c52c6-2e99-465a-9654-c58b12dd213e-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6t8ns\" (UID: \"d51c52c6-2e99-465a-9654-c58b12dd213e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t8ns"
Sep 29 10:10:54 crc kubenswrapper[4891]: I0929 10:10:54.308030 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7z54\" (UniqueName: \"kubernetes.io/projected/d51c52c6-2e99-465a-9654-c58b12dd213e-kube-api-access-r7z54\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6t8ns\" (UID: \"d51c52c6-2e99-465a-9654-c58b12dd213e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t8ns"
Sep 29 10:10:54 crc kubenswrapper[4891]: I0929 10:10:54.308133 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d51c52c6-2e99-465a-9654-c58b12dd213e-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6t8ns\" (UID: \"d51c52c6-2e99-465a-9654-c58b12dd213e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t8ns"
Sep 29 10:10:54 crc kubenswrapper[4891]: I0929 10:10:54.313814 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d51c52c6-2e99-465a-9654-c58b12dd213e-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6t8ns\" (UID: \"d51c52c6-2e99-465a-9654-c58b12dd213e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t8ns"
Sep 29 10:10:54 crc kubenswrapper[4891]: I0929 10:10:54.315507 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d51c52c6-2e99-465a-9654-c58b12dd213e-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6t8ns\" (UID: \"d51c52c6-2e99-465a-9654-c58b12dd213e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t8ns"
Sep 29 10:10:54 crc kubenswrapper[4891]: I0929 10:10:54.329692 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7z54\" (UniqueName: \"kubernetes.io/projected/d51c52c6-2e99-465a-9654-c58b12dd213e-kube-api-access-r7z54\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6t8ns\" (UID: \"d51c52c6-2e99-465a-9654-c58b12dd213e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t8ns"
Sep 29 10:10:54 crc kubenswrapper[4891]: I0929 10:10:54.347775 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t8ns"
Sep 29 10:10:54 crc kubenswrapper[4891]: I0929 10:10:54.924246 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t8ns"]
Sep 29 10:10:54 crc kubenswrapper[4891]: I0929 10:10:54.969565 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t8ns" event={"ID":"d51c52c6-2e99-465a-9654-c58b12dd213e","Type":"ContainerStarted","Data":"b33dea52235500f1755bf21da0472de7f2974429361395d50d79c9765e0ac12a"}
Sep 29 10:10:55 crc kubenswrapper[4891]: I0929 10:10:55.983735 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t8ns" event={"ID":"d51c52c6-2e99-465a-9654-c58b12dd213e","Type":"ContainerStarted","Data":"724e1dc0a6877c6853cfb015f60187d2ac179d6cebe281f700d4e97f17f2b290"}
Sep 29 10:10:56 crc kubenswrapper[4891]: I0929 10:10:56.002884 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t8ns" podStartSLOduration=1.393072568 podStartE2EDuration="2.002858332s" podCreationTimestamp="2025-09-29 10:10:54 +0000 UTC" firstStartedPulling="2025-09-29 10:10:54.930150003 +0000 UTC m=+1385.135318324" lastFinishedPulling="2025-09-29 10:10:55.539935767 +0000 UTC m=+1385.745104088" observedRunningTime="2025-09-29 10:10:55.999091101 +0000 UTC m=+1386.204259432" watchObservedRunningTime="2025-09-29 10:10:56.002858332 +0000 UTC m=+1386.208026683"
Sep 29 10:10:59 crc kubenswrapper[4891]: I0929 10:10:59.019617 4891 generic.go:334] "Generic (PLEG): container finished" podID="d51c52c6-2e99-465a-9654-c58b12dd213e" containerID="724e1dc0a6877c6853cfb015f60187d2ac179d6cebe281f700d4e97f17f2b290" exitCode=0
Sep 29 10:10:59 crc kubenswrapper[4891]: I0929 10:10:59.019724 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t8ns" event={"ID":"d51c52c6-2e99-465a-9654-c58b12dd213e","Type":"ContainerDied","Data":"724e1dc0a6877c6853cfb015f60187d2ac179d6cebe281f700d4e97f17f2b290"}
Sep 29 10:11:00 crc kubenswrapper[4891]: I0929 10:11:00.479590 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t8ns"
Sep 29 10:11:00 crc kubenswrapper[4891]: I0929 10:11:00.646137 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7z54\" (UniqueName: \"kubernetes.io/projected/d51c52c6-2e99-465a-9654-c58b12dd213e-kube-api-access-r7z54\") pod \"d51c52c6-2e99-465a-9654-c58b12dd213e\" (UID: \"d51c52c6-2e99-465a-9654-c58b12dd213e\") "
Sep 29 10:11:00 crc kubenswrapper[4891]: I0929 10:11:00.646966 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d51c52c6-2e99-465a-9654-c58b12dd213e-ssh-key\") pod \"d51c52c6-2e99-465a-9654-c58b12dd213e\" (UID: \"d51c52c6-2e99-465a-9654-c58b12dd213e\") "
Sep 29 10:11:00 crc kubenswrapper[4891]: I0929 10:11:00.647018 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d51c52c6-2e99-465a-9654-c58b12dd213e-inventory\") pod \"d51c52c6-2e99-465a-9654-c58b12dd213e\" (UID: \"d51c52c6-2e99-465a-9654-c58b12dd213e\") "
Sep 29 10:11:00 crc kubenswrapper[4891]: I0929 10:11:00.652490 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d51c52c6-2e99-465a-9654-c58b12dd213e-kube-api-access-r7z54" (OuterVolumeSpecName: "kube-api-access-r7z54") pod "d51c52c6-2e99-465a-9654-c58b12dd213e" (UID: "d51c52c6-2e99-465a-9654-c58b12dd213e"). InnerVolumeSpecName "kube-api-access-r7z54". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:11:00 crc kubenswrapper[4891]: I0929 10:11:00.682089 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d51c52c6-2e99-465a-9654-c58b12dd213e-inventory" (OuterVolumeSpecName: "inventory") pod "d51c52c6-2e99-465a-9654-c58b12dd213e" (UID: "d51c52c6-2e99-465a-9654-c58b12dd213e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:11:00 crc kubenswrapper[4891]: I0929 10:11:00.689341 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d51c52c6-2e99-465a-9654-c58b12dd213e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d51c52c6-2e99-465a-9654-c58b12dd213e" (UID: "d51c52c6-2e99-465a-9654-c58b12dd213e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:11:00 crc kubenswrapper[4891]: I0929 10:11:00.749408 4891 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d51c52c6-2e99-465a-9654-c58b12dd213e-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 29 10:11:00 crc kubenswrapper[4891]: I0929 10:11:00.749462 4891 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d51c52c6-2e99-465a-9654-c58b12dd213e-inventory\") on node \"crc\" DevicePath \"\""
Sep 29 10:11:00 crc kubenswrapper[4891]: I0929 10:11:00.749483 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7z54\" (UniqueName: \"kubernetes.io/projected/d51c52c6-2e99-465a-9654-c58b12dd213e-kube-api-access-r7z54\") on node \"crc\" DevicePath \"\""
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.050754 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t8ns" event={"ID":"d51c52c6-2e99-465a-9654-c58b12dd213e","Type":"ContainerDied","Data":"b33dea52235500f1755bf21da0472de7f2974429361395d50d79c9765e0ac12a"}
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.050808 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b33dea52235500f1755bf21da0472de7f2974429361395d50d79c9765e0ac12a"
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.050867 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6t8ns"
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.146280 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4"]
Sep 29 10:11:01 crc kubenswrapper[4891]: E0929 10:11:01.146812 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d51c52c6-2e99-465a-9654-c58b12dd213e" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.146834 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="d51c52c6-2e99-465a-9654-c58b12dd213e" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.147083 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="d51c52c6-2e99-465a-9654-c58b12dd213e" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.148540 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4"
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.155423 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.155935 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.156847 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c677d7a-2716-4c8d-8d87-7c158ca5de6c-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4\" (UID: \"9c677d7a-2716-4c8d-8d87-7c158ca5de6c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4"
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.156978 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c677d7a-2716-4c8d-8d87-7c158ca5de6c-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4\" (UID: \"9c677d7a-2716-4c8d-8d87-7c158ca5de6c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4"
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.157001 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vrld\" (UniqueName: \"kubernetes.io/projected/9c677d7a-2716-4c8d-8d87-7c158ca5de6c-kube-api-access-8vrld\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4\" (UID: \"9c677d7a-2716-4c8d-8d87-7c158ca5de6c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4"
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.157029 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c677d7a-2716-4c8d-8d87-7c158ca5de6c-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4\" (UID: \"9c677d7a-2716-4c8d-8d87-7c158ca5de6c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4"
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.157223 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4"]
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.157246 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b9rgd"
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.157332 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.259130 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c677d7a-2716-4c8d-8d87-7c158ca5de6c-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4\" (UID: \"9c677d7a-2716-4c8d-8d87-7c158ca5de6c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4"
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.259172 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vrld\" (UniqueName: \"kubernetes.io/projected/9c677d7a-2716-4c8d-8d87-7c158ca5de6c-kube-api-access-8vrld\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4\" (UID: \"9c677d7a-2716-4c8d-8d87-7c158ca5de6c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4"
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.259206 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c677d7a-2716-4c8d-8d87-7c158ca5de6c-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4\" (UID: \"9c677d7a-2716-4c8d-8d87-7c158ca5de6c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4"
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.259236 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c677d7a-2716-4c8d-8d87-7c158ca5de6c-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4\" (UID: \"9c677d7a-2716-4c8d-8d87-7c158ca5de6c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4"
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.262993 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c677d7a-2716-4c8d-8d87-7c158ca5de6c-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4\" (UID: \"9c677d7a-2716-4c8d-8d87-7c158ca5de6c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4"
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.263668 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c677d7a-2716-4c8d-8d87-7c158ca5de6c-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4\" (UID: \"9c677d7a-2716-4c8d-8d87-7c158ca5de6c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4"
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.263783 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c677d7a-2716-4c8d-8d87-7c158ca5de6c-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4\" (UID: \"9c677d7a-2716-4c8d-8d87-7c158ca5de6c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4"
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.275668 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vrld\" (UniqueName: \"kubernetes.io/projected/9c677d7a-2716-4c8d-8d87-7c158ca5de6c-kube-api-access-8vrld\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4\" (UID: \"9c677d7a-2716-4c8d-8d87-7c158ca5de6c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4"
Sep 29 10:11:01 crc kubenswrapper[4891]: I0929 10:11:01.477176 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4"
Sep 29 10:11:02 crc kubenswrapper[4891]: I0929 10:11:02.067289 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4"]
Sep 29 10:11:03 crc kubenswrapper[4891]: I0929 10:11:03.073079 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4" event={"ID":"9c677d7a-2716-4c8d-8d87-7c158ca5de6c","Type":"ContainerStarted","Data":"e24a2b7196159a8f823ea2b651e31c8d26bc138fadb0210c69beb18215f0faaf"}
Sep 29 10:11:04 crc kubenswrapper[4891]: I0929 10:11:04.090104 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4" event={"ID":"9c677d7a-2716-4c8d-8d87-7c158ca5de6c","Type":"ContainerStarted","Data":"d14bc2d07e746ea7c36e2c1c841de07292df919c0b90aa53f0c9ac0eb3ce4a5f"}
Sep 29 10:11:04 crc kubenswrapper[4891]: I0929 10:11:04.121265 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4" podStartSLOduration=2.186861934 podStartE2EDuration="3.121239603s" podCreationTimestamp="2025-09-29 10:11:01 +0000 UTC" firstStartedPulling="2025-09-29 10:11:02.074685738 +0000 UTC m=+1392.279854099" lastFinishedPulling="2025-09-29 10:11:03.009063447 +0000 UTC m=+1393.214231768" observedRunningTime="2025-09-29 10:11:04.112759954 +0000 UTC m=+1394.317928305" watchObservedRunningTime="2025-09-29 10:11:04.121239603 +0000 UTC m=+1394.326407944"
Sep 29 10:11:06 crc kubenswrapper[4891]: I0929 10:11:06.185993 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 29 10:11:06 crc kubenswrapper[4891]: I0929 10:11:06.186440 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 10:11:06 crc kubenswrapper[4891]: I0929 10:11:06.186509 4891 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp"
Sep 29 10:11:06 crc kubenswrapper[4891]: I0929 10:11:06.187285 4891 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d8a4450f0cf100e3bb069af7451326e6487159492cf29cc8115c4ea917d31960"} pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 29 10:11:06 crc kubenswrapper[4891]: I0929 10:11:06.187385 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" containerID="cri-o://d8a4450f0cf100e3bb069af7451326e6487159492cf29cc8115c4ea917d31960" gracePeriod=600
Sep 29 10:11:07 crc kubenswrapper[4891]: I0929 10:11:07.124556 4891 generic.go:334] "Generic (PLEG): container finished" podID="582de198-5a15-4c4c-aaea-881c638a42ac" containerID="d8a4450f0cf100e3bb069af7451326e6487159492cf29cc8115c4ea917d31960" exitCode=0
Sep 29 10:11:07 crc kubenswrapper[4891]: I0929 10:11:07.124636 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerDied","Data":"d8a4450f0cf100e3bb069af7451326e6487159492cf29cc8115c4ea917d31960"}
Sep 29 10:11:07 crc kubenswrapper[4891]: I0929 10:11:07.125003 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerStarted","Data":"4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5"}
Sep 29 10:11:07 crc kubenswrapper[4891]: I0929 10:11:07.125076 4891 scope.go:117] "RemoveContainer" containerID="7bbfbfc1e771a86a3e36e83972d117fbe73f0bd186aec5042fa0d90572ce1a07"
Sep 29 10:11:14 crc kubenswrapper[4891]: I0929 10:11:14.202379 4891 scope.go:117] "RemoveContainer" containerID="4aa39b8e2d1af6216ca011bd793c036285d180bacb8213ef333c981ae198cc94"
Sep 29 10:12:14 crc kubenswrapper[4891]: I0929 10:12:14.277090 4891 scope.go:117] "RemoveContainer" containerID="94733727fc21851a7beb9ab47c467ae767e5719fb96d08dd9b66466a29d4c5d5"
Sep 29 10:12:28 crc kubenswrapper[4891]: I0929 10:12:28.585724 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v2tt2"]
Sep 29 10:12:28 crc kubenswrapper[4891]: I0929 10:12:28.591643 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v2tt2"
Sep 29 10:12:28 crc kubenswrapper[4891]: I0929 10:12:28.603928 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v2tt2"]
Sep 29 10:12:28 crc kubenswrapper[4891]: I0929 10:12:28.703326 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6ccv\" (UniqueName: \"kubernetes.io/projected/b442cf64-c005-4418-ae79-3cbf159fe816-kube-api-access-b6ccv\") pod \"redhat-marketplace-v2tt2\" (UID: \"b442cf64-c005-4418-ae79-3cbf159fe816\") " pod="openshift-marketplace/redhat-marketplace-v2tt2"
Sep 29 10:12:28 crc kubenswrapper[4891]: I0929 10:12:28.703555 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b442cf64-c005-4418-ae79-3cbf159fe816-utilities\") pod \"redhat-marketplace-v2tt2\" (UID: \"b442cf64-c005-4418-ae79-3cbf159fe816\") " pod="openshift-marketplace/redhat-marketplace-v2tt2"
Sep 29 10:12:28 crc kubenswrapper[4891]: I0929 10:12:28.703602 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b442cf64-c005-4418-ae79-3cbf159fe816-catalog-content\") pod \"redhat-marketplace-v2tt2\" (UID: \"b442cf64-c005-4418-ae79-3cbf159fe816\") " pod="openshift-marketplace/redhat-marketplace-v2tt2"
Sep 29 10:12:28 crc kubenswrapper[4891]: I0929 10:12:28.805632 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b442cf64-c005-4418-ae79-3cbf159fe816-utilities\") pod \"redhat-marketplace-v2tt2\" (UID: \"b442cf64-c005-4418-ae79-3cbf159fe816\") " pod="openshift-marketplace/redhat-marketplace-v2tt2"
Sep 29 10:12:28 crc kubenswrapper[4891]: I0929 10:12:28.805710 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b442cf64-c005-4418-ae79-3cbf159fe816-catalog-content\") pod \"redhat-marketplace-v2tt2\" (UID: \"b442cf64-c005-4418-ae79-3cbf159fe816\") " pod="openshift-marketplace/redhat-marketplace-v2tt2"
Sep 29 10:12:28 crc kubenswrapper[4891]: I0929 10:12:28.805748 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6ccv\" (UniqueName: \"kubernetes.io/projected/b442cf64-c005-4418-ae79-3cbf159fe816-kube-api-access-b6ccv\") pod \"redhat-marketplace-v2tt2\" (UID: \"b442cf64-c005-4418-ae79-3cbf159fe816\") " pod="openshift-marketplace/redhat-marketplace-v2tt2"
Sep 29 10:12:28 crc kubenswrapper[4891]: I0929 10:12:28.806374 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b442cf64-c005-4418-ae79-3cbf159fe816-utilities\") pod \"redhat-marketplace-v2tt2\" (UID: \"b442cf64-c005-4418-ae79-3cbf159fe816\") " pod="openshift-marketplace/redhat-marketplace-v2tt2"
Sep 29 10:12:28 crc kubenswrapper[4891]: I0929 10:12:28.806459 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b442cf64-c005-4418-ae79-3cbf159fe816-catalog-content\") pod \"redhat-marketplace-v2tt2\" (UID: \"b442cf64-c005-4418-ae79-3cbf159fe816\") " pod="openshift-marketplace/redhat-marketplace-v2tt2"
Sep 29 10:12:28 crc kubenswrapper[4891]: I0929 10:12:28.830171 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6ccv\" (UniqueName: \"kubernetes.io/projected/b442cf64-c005-4418-ae79-3cbf159fe816-kube-api-access-b6ccv\") pod \"redhat-marketplace-v2tt2\" (UID: \"b442cf64-c005-4418-ae79-3cbf159fe816\") " pod="openshift-marketplace/redhat-marketplace-v2tt2"
Sep 29 10:12:28 crc kubenswrapper[4891]: I0929 10:12:28.942574 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v2tt2"
Sep 29 10:12:29 crc kubenswrapper[4891]: I0929 10:12:29.407640 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v2tt2"]
Sep 29 10:12:30 crc kubenswrapper[4891]: I0929 10:12:30.048831 4891 generic.go:334] "Generic (PLEG): container finished" podID="b442cf64-c005-4418-ae79-3cbf159fe816" containerID="902f437f1c3467ecdd9b17e508226e9b9ccfdd684d534e9b49ff45041191c996" exitCode=0
Sep 29 10:12:30 crc kubenswrapper[4891]: I0929 10:12:30.048972 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v2tt2" event={"ID":"b442cf64-c005-4418-ae79-3cbf159fe816","Type":"ContainerDied","Data":"902f437f1c3467ecdd9b17e508226e9b9ccfdd684d534e9b49ff45041191c996"}
Sep 29 10:12:30 crc kubenswrapper[4891]: I0929 10:12:30.049133 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v2tt2" event={"ID":"b442cf64-c005-4418-ae79-3cbf159fe816","Type":"ContainerStarted","Data":"0a72c688029443fbe756392bbf2fd79b6cf73588934baaf8f08c6af7e193f05f"}
Sep 29 10:12:32 crc kubenswrapper[4891]: I0929 10:12:32.072154 4891 generic.go:334] "Generic (PLEG): container finished" podID="b442cf64-c005-4418-ae79-3cbf159fe816" containerID="0091d92904d5c575931d5c7a4497d0124a50d209ebb03964e8c79259b6c0ec49" exitCode=0
Sep 29 10:12:32 crc kubenswrapper[4891]: I0929 10:12:32.072284 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v2tt2" event={"ID":"b442cf64-c005-4418-ae79-3cbf159fe816","Type":"ContainerDied","Data":"0091d92904d5c575931d5c7a4497d0124a50d209ebb03964e8c79259b6c0ec49"}
Sep 29 10:12:33 crc kubenswrapper[4891]: I0929 10:12:33.086610 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v2tt2"
event={"ID":"b442cf64-c005-4418-ae79-3cbf159fe816","Type":"ContainerStarted","Data":"6dff69b466a0dfa2b38ec4fa4a0142436ad8b187c999f3a4e63aa3b8672a3b79"} Sep 29 10:12:38 crc kubenswrapper[4891]: I0929 10:12:38.943325 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v2tt2" Sep 29 10:12:38 crc kubenswrapper[4891]: I0929 10:12:38.944821 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v2tt2" Sep 29 10:12:38 crc kubenswrapper[4891]: I0929 10:12:38.991585 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v2tt2" Sep 29 10:12:39 crc kubenswrapper[4891]: I0929 10:12:39.011680 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v2tt2" podStartSLOduration=8.599528476 podStartE2EDuration="11.011660948s" podCreationTimestamp="2025-09-29 10:12:28 +0000 UTC" firstStartedPulling="2025-09-29 10:12:30.050731044 +0000 UTC m=+1480.255899365" lastFinishedPulling="2025-09-29 10:12:32.462863506 +0000 UTC m=+1482.668031837" observedRunningTime="2025-09-29 10:12:33.108692867 +0000 UTC m=+1483.313861208" watchObservedRunningTime="2025-09-29 10:12:39.011660948 +0000 UTC m=+1489.216829279" Sep 29 10:12:39 crc kubenswrapper[4891]: I0929 10:12:39.193435 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v2tt2" Sep 29 10:12:39 crc kubenswrapper[4891]: I0929 10:12:39.253724 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v2tt2"] Sep 29 10:12:41 crc kubenswrapper[4891]: I0929 10:12:41.171747 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v2tt2" podUID="b442cf64-c005-4418-ae79-3cbf159fe816" containerName="registry-server" 
containerID="cri-o://6dff69b466a0dfa2b38ec4fa4a0142436ad8b187c999f3a4e63aa3b8672a3b79" gracePeriod=2 Sep 29 10:12:41 crc kubenswrapper[4891]: I0929 10:12:41.661338 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n7fvn"] Sep 29 10:12:41 crc kubenswrapper[4891]: I0929 10:12:41.665280 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n7fvn" Sep 29 10:12:41 crc kubenswrapper[4891]: I0929 10:12:41.674403 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n7fvn"] Sep 29 10:12:41 crc kubenswrapper[4891]: I0929 10:12:41.679225 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg526\" (UniqueName: \"kubernetes.io/projected/f4422f82-485a-4a91-9ff9-9f050a6f66b1-kube-api-access-rg526\") pod \"community-operators-n7fvn\" (UID: \"f4422f82-485a-4a91-9ff9-9f050a6f66b1\") " pod="openshift-marketplace/community-operators-n7fvn" Sep 29 10:12:41 crc kubenswrapper[4891]: I0929 10:12:41.679553 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4422f82-485a-4a91-9ff9-9f050a6f66b1-catalog-content\") pod \"community-operators-n7fvn\" (UID: \"f4422f82-485a-4a91-9ff9-9f050a6f66b1\") " pod="openshift-marketplace/community-operators-n7fvn" Sep 29 10:12:41 crc kubenswrapper[4891]: I0929 10:12:41.679826 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4422f82-485a-4a91-9ff9-9f050a6f66b1-utilities\") pod \"community-operators-n7fvn\" (UID: \"f4422f82-485a-4a91-9ff9-9f050a6f66b1\") " pod="openshift-marketplace/community-operators-n7fvn" Sep 29 10:12:41 crc kubenswrapper[4891]: I0929 10:12:41.782920 4891 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4422f82-485a-4a91-9ff9-9f050a6f66b1-catalog-content\") pod \"community-operators-n7fvn\" (UID: \"f4422f82-485a-4a91-9ff9-9f050a6f66b1\") " pod="openshift-marketplace/community-operators-n7fvn" Sep 29 10:12:41 crc kubenswrapper[4891]: I0929 10:12:41.783101 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4422f82-485a-4a91-9ff9-9f050a6f66b1-utilities\") pod \"community-operators-n7fvn\" (UID: \"f4422f82-485a-4a91-9ff9-9f050a6f66b1\") " pod="openshift-marketplace/community-operators-n7fvn" Sep 29 10:12:41 crc kubenswrapper[4891]: I0929 10:12:41.783181 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg526\" (UniqueName: \"kubernetes.io/projected/f4422f82-485a-4a91-9ff9-9f050a6f66b1-kube-api-access-rg526\") pod \"community-operators-n7fvn\" (UID: \"f4422f82-485a-4a91-9ff9-9f050a6f66b1\") " pod="openshift-marketplace/community-operators-n7fvn" Sep 29 10:12:41 crc kubenswrapper[4891]: I0929 10:12:41.784172 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4422f82-485a-4a91-9ff9-9f050a6f66b1-catalog-content\") pod \"community-operators-n7fvn\" (UID: \"f4422f82-485a-4a91-9ff9-9f050a6f66b1\") " pod="openshift-marketplace/community-operators-n7fvn" Sep 29 10:12:41 crc kubenswrapper[4891]: I0929 10:12:41.784475 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4422f82-485a-4a91-9ff9-9f050a6f66b1-utilities\") pod \"community-operators-n7fvn\" (UID: \"f4422f82-485a-4a91-9ff9-9f050a6f66b1\") " pod="openshift-marketplace/community-operators-n7fvn" Sep 29 10:12:41 crc kubenswrapper[4891]: I0929 10:12:41.814227 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rg526\" (UniqueName: \"kubernetes.io/projected/f4422f82-485a-4a91-9ff9-9f050a6f66b1-kube-api-access-rg526\") pod \"community-operators-n7fvn\" (UID: \"f4422f82-485a-4a91-9ff9-9f050a6f66b1\") " pod="openshift-marketplace/community-operators-n7fvn" Sep 29 10:12:42 crc kubenswrapper[4891]: I0929 10:12:42.004744 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n7fvn" Sep 29 10:12:42 crc kubenswrapper[4891]: I0929 10:12:42.224121 4891 generic.go:334] "Generic (PLEG): container finished" podID="b442cf64-c005-4418-ae79-3cbf159fe816" containerID="6dff69b466a0dfa2b38ec4fa4a0142436ad8b187c999f3a4e63aa3b8672a3b79" exitCode=0 Sep 29 10:12:42 crc kubenswrapper[4891]: I0929 10:12:42.225408 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v2tt2" event={"ID":"b442cf64-c005-4418-ae79-3cbf159fe816","Type":"ContainerDied","Data":"6dff69b466a0dfa2b38ec4fa4a0142436ad8b187c999f3a4e63aa3b8672a3b79"} Sep 29 10:12:42 crc kubenswrapper[4891]: I0929 10:12:42.284493 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v2tt2" Sep 29 10:12:42 crc kubenswrapper[4891]: I0929 10:12:42.303845 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b442cf64-c005-4418-ae79-3cbf159fe816-catalog-content\") pod \"b442cf64-c005-4418-ae79-3cbf159fe816\" (UID: \"b442cf64-c005-4418-ae79-3cbf159fe816\") " Sep 29 10:12:42 crc kubenswrapper[4891]: I0929 10:12:42.303898 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6ccv\" (UniqueName: \"kubernetes.io/projected/b442cf64-c005-4418-ae79-3cbf159fe816-kube-api-access-b6ccv\") pod \"b442cf64-c005-4418-ae79-3cbf159fe816\" (UID: \"b442cf64-c005-4418-ae79-3cbf159fe816\") " Sep 29 10:12:42 crc kubenswrapper[4891]: I0929 10:12:42.303927 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b442cf64-c005-4418-ae79-3cbf159fe816-utilities\") pod \"b442cf64-c005-4418-ae79-3cbf159fe816\" (UID: \"b442cf64-c005-4418-ae79-3cbf159fe816\") " Sep 29 10:12:42 crc kubenswrapper[4891]: I0929 10:12:42.306499 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b442cf64-c005-4418-ae79-3cbf159fe816-utilities" (OuterVolumeSpecName: "utilities") pod "b442cf64-c005-4418-ae79-3cbf159fe816" (UID: "b442cf64-c005-4418-ae79-3cbf159fe816"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:12:42 crc kubenswrapper[4891]: I0929 10:12:42.324604 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b442cf64-c005-4418-ae79-3cbf159fe816-kube-api-access-b6ccv" (OuterVolumeSpecName: "kube-api-access-b6ccv") pod "b442cf64-c005-4418-ae79-3cbf159fe816" (UID: "b442cf64-c005-4418-ae79-3cbf159fe816"). InnerVolumeSpecName "kube-api-access-b6ccv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:12:42 crc kubenswrapper[4891]: I0929 10:12:42.340658 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b442cf64-c005-4418-ae79-3cbf159fe816-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b442cf64-c005-4418-ae79-3cbf159fe816" (UID: "b442cf64-c005-4418-ae79-3cbf159fe816"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:12:42 crc kubenswrapper[4891]: I0929 10:12:42.407006 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b442cf64-c005-4418-ae79-3cbf159fe816-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:12:42 crc kubenswrapper[4891]: I0929 10:12:42.407040 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6ccv\" (UniqueName: \"kubernetes.io/projected/b442cf64-c005-4418-ae79-3cbf159fe816-kube-api-access-b6ccv\") on node \"crc\" DevicePath \"\"" Sep 29 10:12:42 crc kubenswrapper[4891]: I0929 10:12:42.407056 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b442cf64-c005-4418-ae79-3cbf159fe816-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:12:42 crc kubenswrapper[4891]: I0929 10:12:42.633493 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n7fvn"] Sep 29 10:12:43 crc kubenswrapper[4891]: I0929 10:12:43.240915 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v2tt2" event={"ID":"b442cf64-c005-4418-ae79-3cbf159fe816","Type":"ContainerDied","Data":"0a72c688029443fbe756392bbf2fd79b6cf73588934baaf8f08c6af7e193f05f"} Sep 29 10:12:43 crc kubenswrapper[4891]: I0929 10:12:43.241316 4891 scope.go:117] "RemoveContainer" containerID="6dff69b466a0dfa2b38ec4fa4a0142436ad8b187c999f3a4e63aa3b8672a3b79" Sep 29 10:12:43 
crc kubenswrapper[4891]: I0929 10:12:43.240940 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v2tt2" Sep 29 10:12:43 crc kubenswrapper[4891]: I0929 10:12:43.244954 4891 generic.go:334] "Generic (PLEG): container finished" podID="f4422f82-485a-4a91-9ff9-9f050a6f66b1" containerID="6c424c7fa79cb32b4a005ab870349b586bc5926e2782f6efd4ffb14a6de18aca" exitCode=0 Sep 29 10:12:43 crc kubenswrapper[4891]: I0929 10:12:43.245027 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7fvn" event={"ID":"f4422f82-485a-4a91-9ff9-9f050a6f66b1","Type":"ContainerDied","Data":"6c424c7fa79cb32b4a005ab870349b586bc5926e2782f6efd4ffb14a6de18aca"} Sep 29 10:12:43 crc kubenswrapper[4891]: I0929 10:12:43.245095 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7fvn" event={"ID":"f4422f82-485a-4a91-9ff9-9f050a6f66b1","Type":"ContainerStarted","Data":"c451a21addab98b791e7a7e993dc6aeaf6e6fb6c1dd40b3fa7514c749d3a8fef"} Sep 29 10:12:43 crc kubenswrapper[4891]: I0929 10:12:43.267573 4891 scope.go:117] "RemoveContainer" containerID="0091d92904d5c575931d5c7a4497d0124a50d209ebb03964e8c79259b6c0ec49" Sep 29 10:12:43 crc kubenswrapper[4891]: I0929 10:12:43.303832 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v2tt2"] Sep 29 10:12:43 crc kubenswrapper[4891]: I0929 10:12:43.313389 4891 scope.go:117] "RemoveContainer" containerID="902f437f1c3467ecdd9b17e508226e9b9ccfdd684d534e9b49ff45041191c996" Sep 29 10:12:43 crc kubenswrapper[4891]: I0929 10:12:43.316138 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v2tt2"] Sep 29 10:12:44 crc kubenswrapper[4891]: I0929 10:12:44.258938 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7fvn" 
event={"ID":"f4422f82-485a-4a91-9ff9-9f050a6f66b1","Type":"ContainerStarted","Data":"43cf1076a99be62fe7ca3a09df7674079c2e39a40c2725ea95253976dcc0fc3e"} Sep 29 10:12:44 crc kubenswrapper[4891]: I0929 10:12:44.411283 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b442cf64-c005-4418-ae79-3cbf159fe816" path="/var/lib/kubelet/pods/b442cf64-c005-4418-ae79-3cbf159fe816/volumes" Sep 29 10:12:45 crc kubenswrapper[4891]: I0929 10:12:45.277935 4891 generic.go:334] "Generic (PLEG): container finished" podID="f4422f82-485a-4a91-9ff9-9f050a6f66b1" containerID="43cf1076a99be62fe7ca3a09df7674079c2e39a40c2725ea95253976dcc0fc3e" exitCode=0 Sep 29 10:12:45 crc kubenswrapper[4891]: I0929 10:12:45.278032 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7fvn" event={"ID":"f4422f82-485a-4a91-9ff9-9f050a6f66b1","Type":"ContainerDied","Data":"43cf1076a99be62fe7ca3a09df7674079c2e39a40c2725ea95253976dcc0fc3e"} Sep 29 10:12:47 crc kubenswrapper[4891]: I0929 10:12:47.308214 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7fvn" event={"ID":"f4422f82-485a-4a91-9ff9-9f050a6f66b1","Type":"ContainerStarted","Data":"42c7f5200c9d1a1c5a461a463153b63abca20fee7da80c1581a584209f514ca5"} Sep 29 10:12:47 crc kubenswrapper[4891]: I0929 10:12:47.336779 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n7fvn" podStartSLOduration=3.498921928 podStartE2EDuration="6.336752486s" podCreationTimestamp="2025-09-29 10:12:41 +0000 UTC" firstStartedPulling="2025-09-29 10:12:43.247467307 +0000 UTC m=+1493.452635668" lastFinishedPulling="2025-09-29 10:12:46.085297895 +0000 UTC m=+1496.290466226" observedRunningTime="2025-09-29 10:12:47.334605664 +0000 UTC m=+1497.539773985" watchObservedRunningTime="2025-09-29 10:12:47.336752486 +0000 UTC m=+1497.541920807" Sep 29 10:12:52 crc kubenswrapper[4891]: I0929 
10:12:52.005433 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n7fvn" Sep 29 10:12:52 crc kubenswrapper[4891]: I0929 10:12:52.006071 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n7fvn" Sep 29 10:12:52 crc kubenswrapper[4891]: I0929 10:12:52.057184 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n7fvn" Sep 29 10:12:52 crc kubenswrapper[4891]: I0929 10:12:52.442118 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n7fvn" Sep 29 10:12:52 crc kubenswrapper[4891]: I0929 10:12:52.500161 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n7fvn"] Sep 29 10:12:54 crc kubenswrapper[4891]: I0929 10:12:54.386131 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n7fvn" podUID="f4422f82-485a-4a91-9ff9-9f050a6f66b1" containerName="registry-server" containerID="cri-o://42c7f5200c9d1a1c5a461a463153b63abca20fee7da80c1581a584209f514ca5" gracePeriod=2 Sep 29 10:12:54 crc kubenswrapper[4891]: I0929 10:12:54.900759 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n7fvn" Sep 29 10:12:54 crc kubenswrapper[4891]: I0929 10:12:54.932446 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg526\" (UniqueName: \"kubernetes.io/projected/f4422f82-485a-4a91-9ff9-9f050a6f66b1-kube-api-access-rg526\") pod \"f4422f82-485a-4a91-9ff9-9f050a6f66b1\" (UID: \"f4422f82-485a-4a91-9ff9-9f050a6f66b1\") " Sep 29 10:12:54 crc kubenswrapper[4891]: I0929 10:12:54.932736 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4422f82-485a-4a91-9ff9-9f050a6f66b1-catalog-content\") pod \"f4422f82-485a-4a91-9ff9-9f050a6f66b1\" (UID: \"f4422f82-485a-4a91-9ff9-9f050a6f66b1\") " Sep 29 10:12:54 crc kubenswrapper[4891]: I0929 10:12:54.932834 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4422f82-485a-4a91-9ff9-9f050a6f66b1-utilities\") pod \"f4422f82-485a-4a91-9ff9-9f050a6f66b1\" (UID: \"f4422f82-485a-4a91-9ff9-9f050a6f66b1\") " Sep 29 10:12:54 crc kubenswrapper[4891]: I0929 10:12:54.933666 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4422f82-485a-4a91-9ff9-9f050a6f66b1-utilities" (OuterVolumeSpecName: "utilities") pod "f4422f82-485a-4a91-9ff9-9f050a6f66b1" (UID: "f4422f82-485a-4a91-9ff9-9f050a6f66b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:12:54 crc kubenswrapper[4891]: I0929 10:12:54.962236 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4422f82-485a-4a91-9ff9-9f050a6f66b1-kube-api-access-rg526" (OuterVolumeSpecName: "kube-api-access-rg526") pod "f4422f82-485a-4a91-9ff9-9f050a6f66b1" (UID: "f4422f82-485a-4a91-9ff9-9f050a6f66b1"). InnerVolumeSpecName "kube-api-access-rg526". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:12:54 crc kubenswrapper[4891]: I0929 10:12:54.988744 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4422f82-485a-4a91-9ff9-9f050a6f66b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4422f82-485a-4a91-9ff9-9f050a6f66b1" (UID: "f4422f82-485a-4a91-9ff9-9f050a6f66b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:12:55 crc kubenswrapper[4891]: I0929 10:12:55.035192 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4422f82-485a-4a91-9ff9-9f050a6f66b1-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:12:55 crc kubenswrapper[4891]: I0929 10:12:55.035244 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4422f82-485a-4a91-9ff9-9f050a6f66b1-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:12:55 crc kubenswrapper[4891]: I0929 10:12:55.035257 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg526\" (UniqueName: \"kubernetes.io/projected/f4422f82-485a-4a91-9ff9-9f050a6f66b1-kube-api-access-rg526\") on node \"crc\" DevicePath \"\"" Sep 29 10:12:55 crc kubenswrapper[4891]: I0929 10:12:55.403855 4891 generic.go:334] "Generic (PLEG): container finished" podID="f4422f82-485a-4a91-9ff9-9f050a6f66b1" containerID="42c7f5200c9d1a1c5a461a463153b63abca20fee7da80c1581a584209f514ca5" exitCode=0 Sep 29 10:12:55 crc kubenswrapper[4891]: I0929 10:12:55.403969 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n7fvn" Sep 29 10:12:55 crc kubenswrapper[4891]: I0929 10:12:55.404055 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7fvn" event={"ID":"f4422f82-485a-4a91-9ff9-9f050a6f66b1","Type":"ContainerDied","Data":"42c7f5200c9d1a1c5a461a463153b63abca20fee7da80c1581a584209f514ca5"} Sep 29 10:12:55 crc kubenswrapper[4891]: I0929 10:12:55.404419 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7fvn" event={"ID":"f4422f82-485a-4a91-9ff9-9f050a6f66b1","Type":"ContainerDied","Data":"c451a21addab98b791e7a7e993dc6aeaf6e6fb6c1dd40b3fa7514c749d3a8fef"} Sep 29 10:12:55 crc kubenswrapper[4891]: I0929 10:12:55.404458 4891 scope.go:117] "RemoveContainer" containerID="42c7f5200c9d1a1c5a461a463153b63abca20fee7da80c1581a584209f514ca5" Sep 29 10:12:55 crc kubenswrapper[4891]: I0929 10:12:55.440524 4891 scope.go:117] "RemoveContainer" containerID="43cf1076a99be62fe7ca3a09df7674079c2e39a40c2725ea95253976dcc0fc3e" Sep 29 10:12:55 crc kubenswrapper[4891]: I0929 10:12:55.446654 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n7fvn"] Sep 29 10:12:55 crc kubenswrapper[4891]: I0929 10:12:55.458745 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n7fvn"] Sep 29 10:12:55 crc kubenswrapper[4891]: I0929 10:12:55.461560 4891 scope.go:117] "RemoveContainer" containerID="6c424c7fa79cb32b4a005ab870349b586bc5926e2782f6efd4ffb14a6de18aca" Sep 29 10:12:55 crc kubenswrapper[4891]: I0929 10:12:55.517024 4891 scope.go:117] "RemoveContainer" containerID="42c7f5200c9d1a1c5a461a463153b63abca20fee7da80c1581a584209f514ca5" Sep 29 10:12:55 crc kubenswrapper[4891]: E0929 10:12:55.517523 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"42c7f5200c9d1a1c5a461a463153b63abca20fee7da80c1581a584209f514ca5\": container with ID starting with 42c7f5200c9d1a1c5a461a463153b63abca20fee7da80c1581a584209f514ca5 not found: ID does not exist" containerID="42c7f5200c9d1a1c5a461a463153b63abca20fee7da80c1581a584209f514ca5" Sep 29 10:12:55 crc kubenswrapper[4891]: I0929 10:12:55.517556 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42c7f5200c9d1a1c5a461a463153b63abca20fee7da80c1581a584209f514ca5"} err="failed to get container status \"42c7f5200c9d1a1c5a461a463153b63abca20fee7da80c1581a584209f514ca5\": rpc error: code = NotFound desc = could not find container \"42c7f5200c9d1a1c5a461a463153b63abca20fee7da80c1581a584209f514ca5\": container with ID starting with 42c7f5200c9d1a1c5a461a463153b63abca20fee7da80c1581a584209f514ca5 not found: ID does not exist" Sep 29 10:12:55 crc kubenswrapper[4891]: I0929 10:12:55.517578 4891 scope.go:117] "RemoveContainer" containerID="43cf1076a99be62fe7ca3a09df7674079c2e39a40c2725ea95253976dcc0fc3e" Sep 29 10:12:55 crc kubenswrapper[4891]: E0929 10:12:55.517902 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43cf1076a99be62fe7ca3a09df7674079c2e39a40c2725ea95253976dcc0fc3e\": container with ID starting with 43cf1076a99be62fe7ca3a09df7674079c2e39a40c2725ea95253976dcc0fc3e not found: ID does not exist" containerID="43cf1076a99be62fe7ca3a09df7674079c2e39a40c2725ea95253976dcc0fc3e" Sep 29 10:12:55 crc kubenswrapper[4891]: I0929 10:12:55.517957 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43cf1076a99be62fe7ca3a09df7674079c2e39a40c2725ea95253976dcc0fc3e"} err="failed to get container status \"43cf1076a99be62fe7ca3a09df7674079c2e39a40c2725ea95253976dcc0fc3e\": rpc error: code = NotFound desc = could not find container \"43cf1076a99be62fe7ca3a09df7674079c2e39a40c2725ea95253976dcc0fc3e\": container with ID 
starting with 43cf1076a99be62fe7ca3a09df7674079c2e39a40c2725ea95253976dcc0fc3e not found: ID does not exist" Sep 29 10:12:55 crc kubenswrapper[4891]: I0929 10:12:55.517994 4891 scope.go:117] "RemoveContainer" containerID="6c424c7fa79cb32b4a005ab870349b586bc5926e2782f6efd4ffb14a6de18aca" Sep 29 10:12:55 crc kubenswrapper[4891]: E0929 10:12:55.518324 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c424c7fa79cb32b4a005ab870349b586bc5926e2782f6efd4ffb14a6de18aca\": container with ID starting with 6c424c7fa79cb32b4a005ab870349b586bc5926e2782f6efd4ffb14a6de18aca not found: ID does not exist" containerID="6c424c7fa79cb32b4a005ab870349b586bc5926e2782f6efd4ffb14a6de18aca" Sep 29 10:12:55 crc kubenswrapper[4891]: I0929 10:12:55.518371 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c424c7fa79cb32b4a005ab870349b586bc5926e2782f6efd4ffb14a6de18aca"} err="failed to get container status \"6c424c7fa79cb32b4a005ab870349b586bc5926e2782f6efd4ffb14a6de18aca\": rpc error: code = NotFound desc = could not find container \"6c424c7fa79cb32b4a005ab870349b586bc5926e2782f6efd4ffb14a6de18aca\": container with ID starting with 6c424c7fa79cb32b4a005ab870349b586bc5926e2782f6efd4ffb14a6de18aca not found: ID does not exist" Sep 29 10:12:56 crc kubenswrapper[4891]: I0929 10:12:56.414223 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4422f82-485a-4a91-9ff9-9f050a6f66b1" path="/var/lib/kubelet/pods/f4422f82-485a-4a91-9ff9-9f050a6f66b1/volumes" Sep 29 10:12:58 crc kubenswrapper[4891]: I0929 10:12:58.807575 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tl2pn"] Sep 29 10:12:58 crc kubenswrapper[4891]: E0929 10:12:58.808764 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b442cf64-c005-4418-ae79-3cbf159fe816" containerName="extract-content" Sep 29 10:12:58 crc 
kubenswrapper[4891]: I0929 10:12:58.808795 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="b442cf64-c005-4418-ae79-3cbf159fe816" containerName="extract-content" Sep 29 10:12:58 crc kubenswrapper[4891]: E0929 10:12:58.808846 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b442cf64-c005-4418-ae79-3cbf159fe816" containerName="extract-utilities" Sep 29 10:12:58 crc kubenswrapper[4891]: I0929 10:12:58.808858 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="b442cf64-c005-4418-ae79-3cbf159fe816" containerName="extract-utilities" Sep 29 10:12:58 crc kubenswrapper[4891]: E0929 10:12:58.808878 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4422f82-485a-4a91-9ff9-9f050a6f66b1" containerName="extract-utilities" Sep 29 10:12:58 crc kubenswrapper[4891]: I0929 10:12:58.808889 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4422f82-485a-4a91-9ff9-9f050a6f66b1" containerName="extract-utilities" Sep 29 10:12:58 crc kubenswrapper[4891]: E0929 10:12:58.808915 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4422f82-485a-4a91-9ff9-9f050a6f66b1" containerName="extract-content" Sep 29 10:12:58 crc kubenswrapper[4891]: I0929 10:12:58.808926 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4422f82-485a-4a91-9ff9-9f050a6f66b1" containerName="extract-content" Sep 29 10:12:58 crc kubenswrapper[4891]: E0929 10:12:58.808959 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4422f82-485a-4a91-9ff9-9f050a6f66b1" containerName="registry-server" Sep 29 10:12:58 crc kubenswrapper[4891]: I0929 10:12:58.808970 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4422f82-485a-4a91-9ff9-9f050a6f66b1" containerName="registry-server" Sep 29 10:12:58 crc kubenswrapper[4891]: E0929 10:12:58.809004 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b442cf64-c005-4418-ae79-3cbf159fe816" containerName="registry-server" Sep 29 10:12:58 crc 
kubenswrapper[4891]: I0929 10:12:58.809015 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="b442cf64-c005-4418-ae79-3cbf159fe816" containerName="registry-server" Sep 29 10:12:58 crc kubenswrapper[4891]: I0929 10:12:58.809335 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="b442cf64-c005-4418-ae79-3cbf159fe816" containerName="registry-server" Sep 29 10:12:58 crc kubenswrapper[4891]: I0929 10:12:58.809363 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4422f82-485a-4a91-9ff9-9f050a6f66b1" containerName="registry-server" Sep 29 10:12:58 crc kubenswrapper[4891]: I0929 10:12:58.811715 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tl2pn" Sep 29 10:12:58 crc kubenswrapper[4891]: I0929 10:12:58.833255 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tl2pn"] Sep 29 10:12:58 crc kubenswrapper[4891]: I0929 10:12:58.912974 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/412b64da-3862-4b4d-9afd-2cde2e423d74-catalog-content\") pod \"certified-operators-tl2pn\" (UID: \"412b64da-3862-4b4d-9afd-2cde2e423d74\") " pod="openshift-marketplace/certified-operators-tl2pn" Sep 29 10:12:58 crc kubenswrapper[4891]: I0929 10:12:58.913066 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/412b64da-3862-4b4d-9afd-2cde2e423d74-utilities\") pod \"certified-operators-tl2pn\" (UID: \"412b64da-3862-4b4d-9afd-2cde2e423d74\") " pod="openshift-marketplace/certified-operators-tl2pn" Sep 29 10:12:58 crc kubenswrapper[4891]: I0929 10:12:58.913520 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxr4p\" (UniqueName: 
\"kubernetes.io/projected/412b64da-3862-4b4d-9afd-2cde2e423d74-kube-api-access-mxr4p\") pod \"certified-operators-tl2pn\" (UID: \"412b64da-3862-4b4d-9afd-2cde2e423d74\") " pod="openshift-marketplace/certified-operators-tl2pn" Sep 29 10:12:59 crc kubenswrapper[4891]: I0929 10:12:59.015704 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxr4p\" (UniqueName: \"kubernetes.io/projected/412b64da-3862-4b4d-9afd-2cde2e423d74-kube-api-access-mxr4p\") pod \"certified-operators-tl2pn\" (UID: \"412b64da-3862-4b4d-9afd-2cde2e423d74\") " pod="openshift-marketplace/certified-operators-tl2pn" Sep 29 10:12:59 crc kubenswrapper[4891]: I0929 10:12:59.015843 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/412b64da-3862-4b4d-9afd-2cde2e423d74-catalog-content\") pod \"certified-operators-tl2pn\" (UID: \"412b64da-3862-4b4d-9afd-2cde2e423d74\") " pod="openshift-marketplace/certified-operators-tl2pn" Sep 29 10:12:59 crc kubenswrapper[4891]: I0929 10:12:59.015889 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/412b64da-3862-4b4d-9afd-2cde2e423d74-utilities\") pod \"certified-operators-tl2pn\" (UID: \"412b64da-3862-4b4d-9afd-2cde2e423d74\") " pod="openshift-marketplace/certified-operators-tl2pn" Sep 29 10:12:59 crc kubenswrapper[4891]: I0929 10:12:59.016647 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/412b64da-3862-4b4d-9afd-2cde2e423d74-utilities\") pod \"certified-operators-tl2pn\" (UID: \"412b64da-3862-4b4d-9afd-2cde2e423d74\") " pod="openshift-marketplace/certified-operators-tl2pn" Sep 29 10:12:59 crc kubenswrapper[4891]: I0929 10:12:59.016665 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/412b64da-3862-4b4d-9afd-2cde2e423d74-catalog-content\") pod \"certified-operators-tl2pn\" (UID: \"412b64da-3862-4b4d-9afd-2cde2e423d74\") " pod="openshift-marketplace/certified-operators-tl2pn" Sep 29 10:12:59 crc kubenswrapper[4891]: I0929 10:12:59.038277 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxr4p\" (UniqueName: \"kubernetes.io/projected/412b64da-3862-4b4d-9afd-2cde2e423d74-kube-api-access-mxr4p\") pod \"certified-operators-tl2pn\" (UID: \"412b64da-3862-4b4d-9afd-2cde2e423d74\") " pod="openshift-marketplace/certified-operators-tl2pn" Sep 29 10:12:59 crc kubenswrapper[4891]: I0929 10:12:59.157334 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tl2pn" Sep 29 10:12:59 crc kubenswrapper[4891]: I0929 10:12:59.715310 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tl2pn"] Sep 29 10:13:00 crc kubenswrapper[4891]: I0929 10:13:00.468913 4891 generic.go:334] "Generic (PLEG): container finished" podID="412b64da-3862-4b4d-9afd-2cde2e423d74" containerID="d05af839b48f5f95ea9e00082889ef5a6cc6156f2275f8a687762a711c3348ab" exitCode=0 Sep 29 10:13:00 crc kubenswrapper[4891]: I0929 10:13:00.468994 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tl2pn" event={"ID":"412b64da-3862-4b4d-9afd-2cde2e423d74","Type":"ContainerDied","Data":"d05af839b48f5f95ea9e00082889ef5a6cc6156f2275f8a687762a711c3348ab"} Sep 29 10:13:00 crc kubenswrapper[4891]: I0929 10:13:00.469239 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tl2pn" event={"ID":"412b64da-3862-4b4d-9afd-2cde2e423d74","Type":"ContainerStarted","Data":"f9400276ba3ed4332889231f99a8366a42f2998b4e9e244ca6fa9e4187489035"} Sep 29 10:13:02 crc kubenswrapper[4891]: I0929 10:13:02.492698 4891 generic.go:334] "Generic (PLEG): container 
finished" podID="412b64da-3862-4b4d-9afd-2cde2e423d74" containerID="fa9fc665198c2e76f9b25747cbe6ba4f28ba29fee0d4c62498fdd400330b40aa" exitCode=0 Sep 29 10:13:02 crc kubenswrapper[4891]: I0929 10:13:02.492896 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tl2pn" event={"ID":"412b64da-3862-4b4d-9afd-2cde2e423d74","Type":"ContainerDied","Data":"fa9fc665198c2e76f9b25747cbe6ba4f28ba29fee0d4c62498fdd400330b40aa"} Sep 29 10:13:03 crc kubenswrapper[4891]: I0929 10:13:03.513722 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tl2pn" event={"ID":"412b64da-3862-4b4d-9afd-2cde2e423d74","Type":"ContainerStarted","Data":"c541cd2ee848ab5f250b699dfc5fb9365195de3bebc86246ee48b3b2a58706ea"} Sep 29 10:13:04 crc kubenswrapper[4891]: I0929 10:13:04.560943 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tl2pn" podStartSLOduration=3.859795248 podStartE2EDuration="6.560923591s" podCreationTimestamp="2025-09-29 10:12:58 +0000 UTC" firstStartedPulling="2025-09-29 10:13:00.47159391 +0000 UTC m=+1510.676762241" lastFinishedPulling="2025-09-29 10:13:03.172722263 +0000 UTC m=+1513.377890584" observedRunningTime="2025-09-29 10:13:04.556192424 +0000 UTC m=+1514.761360765" watchObservedRunningTime="2025-09-29 10:13:04.560923591 +0000 UTC m=+1514.766091912" Sep 29 10:13:09 crc kubenswrapper[4891]: I0929 10:13:09.157580 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tl2pn" Sep 29 10:13:09 crc kubenswrapper[4891]: I0929 10:13:09.158117 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tl2pn" Sep 29 10:13:09 crc kubenswrapper[4891]: I0929 10:13:09.224535 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tl2pn" Sep 
29 10:13:09 crc kubenswrapper[4891]: I0929 10:13:09.652859 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tl2pn" Sep 29 10:13:09 crc kubenswrapper[4891]: I0929 10:13:09.725753 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tl2pn"] Sep 29 10:13:11 crc kubenswrapper[4891]: I0929 10:13:11.603852 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tl2pn" podUID="412b64da-3862-4b4d-9afd-2cde2e423d74" containerName="registry-server" containerID="cri-o://c541cd2ee848ab5f250b699dfc5fb9365195de3bebc86246ee48b3b2a58706ea" gracePeriod=2 Sep 29 10:13:12 crc kubenswrapper[4891]: I0929 10:13:12.012895 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tl2pn" Sep 29 10:13:12 crc kubenswrapper[4891]: I0929 10:13:12.195925 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/412b64da-3862-4b4d-9afd-2cde2e423d74-utilities\") pod \"412b64da-3862-4b4d-9afd-2cde2e423d74\" (UID: \"412b64da-3862-4b4d-9afd-2cde2e423d74\") " Sep 29 10:13:12 crc kubenswrapper[4891]: I0929 10:13:12.196229 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxr4p\" (UniqueName: \"kubernetes.io/projected/412b64da-3862-4b4d-9afd-2cde2e423d74-kube-api-access-mxr4p\") pod \"412b64da-3862-4b4d-9afd-2cde2e423d74\" (UID: \"412b64da-3862-4b4d-9afd-2cde2e423d74\") " Sep 29 10:13:12 crc kubenswrapper[4891]: I0929 10:13:12.196275 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/412b64da-3862-4b4d-9afd-2cde2e423d74-catalog-content\") pod \"412b64da-3862-4b4d-9afd-2cde2e423d74\" (UID: \"412b64da-3862-4b4d-9afd-2cde2e423d74\") " 
Sep 29 10:13:12 crc kubenswrapper[4891]: I0929 10:13:12.197451 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/412b64da-3862-4b4d-9afd-2cde2e423d74-utilities" (OuterVolumeSpecName: "utilities") pod "412b64da-3862-4b4d-9afd-2cde2e423d74" (UID: "412b64da-3862-4b4d-9afd-2cde2e423d74"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:13:12 crc kubenswrapper[4891]: I0929 10:13:12.203004 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/412b64da-3862-4b4d-9afd-2cde2e423d74-kube-api-access-mxr4p" (OuterVolumeSpecName: "kube-api-access-mxr4p") pod "412b64da-3862-4b4d-9afd-2cde2e423d74" (UID: "412b64da-3862-4b4d-9afd-2cde2e423d74"). InnerVolumeSpecName "kube-api-access-mxr4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:13:12 crc kubenswrapper[4891]: I0929 10:13:12.249933 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/412b64da-3862-4b4d-9afd-2cde2e423d74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "412b64da-3862-4b4d-9afd-2cde2e423d74" (UID: "412b64da-3862-4b4d-9afd-2cde2e423d74"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:13:12 crc kubenswrapper[4891]: I0929 10:13:12.299017 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxr4p\" (UniqueName: \"kubernetes.io/projected/412b64da-3862-4b4d-9afd-2cde2e423d74-kube-api-access-mxr4p\") on node \"crc\" DevicePath \"\"" Sep 29 10:13:12 crc kubenswrapper[4891]: I0929 10:13:12.299117 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/412b64da-3862-4b4d-9afd-2cde2e423d74-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:13:12 crc kubenswrapper[4891]: I0929 10:13:12.299176 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/412b64da-3862-4b4d-9afd-2cde2e423d74-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:13:12 crc kubenswrapper[4891]: I0929 10:13:12.616882 4891 generic.go:334] "Generic (PLEG): container finished" podID="412b64da-3862-4b4d-9afd-2cde2e423d74" containerID="c541cd2ee848ab5f250b699dfc5fb9365195de3bebc86246ee48b3b2a58706ea" exitCode=0 Sep 29 10:13:12 crc kubenswrapper[4891]: I0929 10:13:12.616930 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tl2pn" event={"ID":"412b64da-3862-4b4d-9afd-2cde2e423d74","Type":"ContainerDied","Data":"c541cd2ee848ab5f250b699dfc5fb9365195de3bebc86246ee48b3b2a58706ea"} Sep 29 10:13:12 crc kubenswrapper[4891]: I0929 10:13:12.616958 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tl2pn" event={"ID":"412b64da-3862-4b4d-9afd-2cde2e423d74","Type":"ContainerDied","Data":"f9400276ba3ed4332889231f99a8366a42f2998b4e9e244ca6fa9e4187489035"} Sep 29 10:13:12 crc kubenswrapper[4891]: I0929 10:13:12.616977 4891 scope.go:117] "RemoveContainer" containerID="c541cd2ee848ab5f250b699dfc5fb9365195de3bebc86246ee48b3b2a58706ea" Sep 29 10:13:12 crc kubenswrapper[4891]: I0929 
10:13:12.616975 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tl2pn" Sep 29 10:13:12 crc kubenswrapper[4891]: I0929 10:13:12.643980 4891 scope.go:117] "RemoveContainer" containerID="fa9fc665198c2e76f9b25747cbe6ba4f28ba29fee0d4c62498fdd400330b40aa" Sep 29 10:13:12 crc kubenswrapper[4891]: I0929 10:13:12.653300 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tl2pn"] Sep 29 10:13:12 crc kubenswrapper[4891]: I0929 10:13:12.661825 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tl2pn"] Sep 29 10:13:12 crc kubenswrapper[4891]: I0929 10:13:12.670937 4891 scope.go:117] "RemoveContainer" containerID="d05af839b48f5f95ea9e00082889ef5a6cc6156f2275f8a687762a711c3348ab" Sep 29 10:13:12 crc kubenswrapper[4891]: I0929 10:13:12.711060 4891 scope.go:117] "RemoveContainer" containerID="c541cd2ee848ab5f250b699dfc5fb9365195de3bebc86246ee48b3b2a58706ea" Sep 29 10:13:12 crc kubenswrapper[4891]: E0929 10:13:12.712189 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c541cd2ee848ab5f250b699dfc5fb9365195de3bebc86246ee48b3b2a58706ea\": container with ID starting with c541cd2ee848ab5f250b699dfc5fb9365195de3bebc86246ee48b3b2a58706ea not found: ID does not exist" containerID="c541cd2ee848ab5f250b699dfc5fb9365195de3bebc86246ee48b3b2a58706ea" Sep 29 10:13:12 crc kubenswrapper[4891]: I0929 10:13:12.712265 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c541cd2ee848ab5f250b699dfc5fb9365195de3bebc86246ee48b3b2a58706ea"} err="failed to get container status \"c541cd2ee848ab5f250b699dfc5fb9365195de3bebc86246ee48b3b2a58706ea\": rpc error: code = NotFound desc = could not find container \"c541cd2ee848ab5f250b699dfc5fb9365195de3bebc86246ee48b3b2a58706ea\": container with ID starting with 
c541cd2ee848ab5f250b699dfc5fb9365195de3bebc86246ee48b3b2a58706ea not found: ID does not exist" Sep 29 10:13:12 crc kubenswrapper[4891]: I0929 10:13:12.712315 4891 scope.go:117] "RemoveContainer" containerID="fa9fc665198c2e76f9b25747cbe6ba4f28ba29fee0d4c62498fdd400330b40aa" Sep 29 10:13:12 crc kubenswrapper[4891]: E0929 10:13:12.712807 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa9fc665198c2e76f9b25747cbe6ba4f28ba29fee0d4c62498fdd400330b40aa\": container with ID starting with fa9fc665198c2e76f9b25747cbe6ba4f28ba29fee0d4c62498fdd400330b40aa not found: ID does not exist" containerID="fa9fc665198c2e76f9b25747cbe6ba4f28ba29fee0d4c62498fdd400330b40aa" Sep 29 10:13:12 crc kubenswrapper[4891]: I0929 10:13:12.712847 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa9fc665198c2e76f9b25747cbe6ba4f28ba29fee0d4c62498fdd400330b40aa"} err="failed to get container status \"fa9fc665198c2e76f9b25747cbe6ba4f28ba29fee0d4c62498fdd400330b40aa\": rpc error: code = NotFound desc = could not find container \"fa9fc665198c2e76f9b25747cbe6ba4f28ba29fee0d4c62498fdd400330b40aa\": container with ID starting with fa9fc665198c2e76f9b25747cbe6ba4f28ba29fee0d4c62498fdd400330b40aa not found: ID does not exist" Sep 29 10:13:12 crc kubenswrapper[4891]: I0929 10:13:12.712868 4891 scope.go:117] "RemoveContainer" containerID="d05af839b48f5f95ea9e00082889ef5a6cc6156f2275f8a687762a711c3348ab" Sep 29 10:13:12 crc kubenswrapper[4891]: E0929 10:13:12.713184 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d05af839b48f5f95ea9e00082889ef5a6cc6156f2275f8a687762a711c3348ab\": container with ID starting with d05af839b48f5f95ea9e00082889ef5a6cc6156f2275f8a687762a711c3348ab not found: ID does not exist" containerID="d05af839b48f5f95ea9e00082889ef5a6cc6156f2275f8a687762a711c3348ab" Sep 29 10:13:12 crc 
kubenswrapper[4891]: I0929 10:13:12.713212 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d05af839b48f5f95ea9e00082889ef5a6cc6156f2275f8a687762a711c3348ab"} err="failed to get container status \"d05af839b48f5f95ea9e00082889ef5a6cc6156f2275f8a687762a711c3348ab\": rpc error: code = NotFound desc = could not find container \"d05af839b48f5f95ea9e00082889ef5a6cc6156f2275f8a687762a711c3348ab\": container with ID starting with d05af839b48f5f95ea9e00082889ef5a6cc6156f2275f8a687762a711c3348ab not found: ID does not exist" Sep 29 10:13:14 crc kubenswrapper[4891]: I0929 10:13:14.369170 4891 scope.go:117] "RemoveContainer" containerID="63e2a990c20ee68d053fc2c1885fb168abc6d7c01bd5a9963201d6591925f826" Sep 29 10:13:14 crc kubenswrapper[4891]: I0929 10:13:14.393247 4891 scope.go:117] "RemoveContainer" containerID="bec3be5d6daa93371faef7bcfba0a7302c998e9bf2c0f4eccea028d31863d8dd" Sep 29 10:13:14 crc kubenswrapper[4891]: I0929 10:13:14.412704 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="412b64da-3862-4b4d-9afd-2cde2e423d74" path="/var/lib/kubelet/pods/412b64da-3862-4b4d-9afd-2cde2e423d74/volumes" Sep 29 10:13:36 crc kubenswrapper[4891]: I0929 10:13:36.186727 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:13:36 crc kubenswrapper[4891]: I0929 10:13:36.187367 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:14:06 crc kubenswrapper[4891]: I0929 10:14:06.185669 4891 
patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:14:06 crc kubenswrapper[4891]: I0929 10:14:06.186159 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:14:14 crc kubenswrapper[4891]: I0929 10:14:14.490654 4891 scope.go:117] "RemoveContainer" containerID="339acbfb5c30a915d3004546d99cfc683ed475fc630608e8ba11344b6a5885ef" Sep 29 10:14:14 crc kubenswrapper[4891]: I0929 10:14:14.525724 4891 scope.go:117] "RemoveContainer" containerID="264fba4076f4d04ae67af2f111df8c937383ee0f4eb73adf020c1ac859403204" Sep 29 10:14:14 crc kubenswrapper[4891]: I0929 10:14:14.566193 4891 scope.go:117] "RemoveContainer" containerID="ca8c3746eb0706dbde359a9bf23b29422f3ba18bdcc369e349d97d6dc1eecf24" Sep 29 10:14:14 crc kubenswrapper[4891]: I0929 10:14:14.592116 4891 scope.go:117] "RemoveContainer" containerID="83d1f3977ce6fa274eb7b28120ff27e9f06ce8eefb41e86207e84443ece6f2b3" Sep 29 10:14:14 crc kubenswrapper[4891]: I0929 10:14:14.617072 4891 scope.go:117] "RemoveContainer" containerID="b9eaa9a629e040af97ca11582be3d9cf24446d5ece244bfc30c02eacdf476759" Sep 29 10:14:14 crc kubenswrapper[4891]: I0929 10:14:14.645979 4891 scope.go:117] "RemoveContainer" containerID="3ab23e15bdc4bf422e11da792917c1ceac6fdb7285179e467711afac5b806ec0" Sep 29 10:14:17 crc kubenswrapper[4891]: I0929 10:14:17.047054 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-tfj4j"] Sep 29 10:14:17 crc kubenswrapper[4891]: I0929 10:14:17.057880 4891 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-tfj4j"] Sep 29 10:14:18 crc kubenswrapper[4891]: I0929 10:14:18.411027 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8429397-c754-44a8-bda3-9162297c7093" path="/var/lib/kubelet/pods/e8429397-c754-44a8-bda3-9162297c7093/volumes" Sep 29 10:14:24 crc kubenswrapper[4891]: I0929 10:14:24.414541 4891 generic.go:334] "Generic (PLEG): container finished" podID="9c677d7a-2716-4c8d-8d87-7c158ca5de6c" containerID="d14bc2d07e746ea7c36e2c1c841de07292df919c0b90aa53f0c9ac0eb3ce4a5f" exitCode=0 Sep 29 10:14:24 crc kubenswrapper[4891]: I0929 10:14:24.414637 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4" event={"ID":"9c677d7a-2716-4c8d-8d87-7c158ca5de6c","Type":"ContainerDied","Data":"d14bc2d07e746ea7c36e2c1c841de07292df919c0b90aa53f0c9ac0eb3ce4a5f"} Sep 29 10:14:25 crc kubenswrapper[4891]: I0929 10:14:25.974692 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.037877 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-r9sf8"] Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.051009 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-c4dvn"] Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.061306 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-c4dvn"] Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.069487 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-r9sf8"] Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.098150 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vrld\" (UniqueName: \"kubernetes.io/projected/9c677d7a-2716-4c8d-8d87-7c158ca5de6c-kube-api-access-8vrld\") pod \"9c677d7a-2716-4c8d-8d87-7c158ca5de6c\" (UID: \"9c677d7a-2716-4c8d-8d87-7c158ca5de6c\") " Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.098305 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c677d7a-2716-4c8d-8d87-7c158ca5de6c-inventory\") pod \"9c677d7a-2716-4c8d-8d87-7c158ca5de6c\" (UID: \"9c677d7a-2716-4c8d-8d87-7c158ca5de6c\") " Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.098339 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c677d7a-2716-4c8d-8d87-7c158ca5de6c-bootstrap-combined-ca-bundle\") pod \"9c677d7a-2716-4c8d-8d87-7c158ca5de6c\" (UID: \"9c677d7a-2716-4c8d-8d87-7c158ca5de6c\") " Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.098466 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/9c677d7a-2716-4c8d-8d87-7c158ca5de6c-ssh-key\") pod \"9c677d7a-2716-4c8d-8d87-7c158ca5de6c\" (UID: \"9c677d7a-2716-4c8d-8d87-7c158ca5de6c\") " Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.104350 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c677d7a-2716-4c8d-8d87-7c158ca5de6c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9c677d7a-2716-4c8d-8d87-7c158ca5de6c" (UID: "9c677d7a-2716-4c8d-8d87-7c158ca5de6c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.116916 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c677d7a-2716-4c8d-8d87-7c158ca5de6c-kube-api-access-8vrld" (OuterVolumeSpecName: "kube-api-access-8vrld") pod "9c677d7a-2716-4c8d-8d87-7c158ca5de6c" (UID: "9c677d7a-2716-4c8d-8d87-7c158ca5de6c"). InnerVolumeSpecName "kube-api-access-8vrld". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.132681 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c677d7a-2716-4c8d-8d87-7c158ca5de6c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9c677d7a-2716-4c8d-8d87-7c158ca5de6c" (UID: "9c677d7a-2716-4c8d-8d87-7c158ca5de6c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.141865 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c677d7a-2716-4c8d-8d87-7c158ca5de6c-inventory" (OuterVolumeSpecName: "inventory") pod "9c677d7a-2716-4c8d-8d87-7c158ca5de6c" (UID: "9c677d7a-2716-4c8d-8d87-7c158ca5de6c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.200907 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vrld\" (UniqueName: \"kubernetes.io/projected/9c677d7a-2716-4c8d-8d87-7c158ca5de6c-kube-api-access-8vrld\") on node \"crc\" DevicePath \"\"" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.200968 4891 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c677d7a-2716-4c8d-8d87-7c158ca5de6c-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.200978 4891 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c677d7a-2716-4c8d-8d87-7c158ca5de6c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.200986 4891 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c677d7a-2716-4c8d-8d87-7c158ca5de6c-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.416837 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ec95c81-cfd3-46a1-befc-581ea6a57bc3" path="/var/lib/kubelet/pods/2ec95c81-cfd3-46a1-befc-581ea6a57bc3/volumes" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.417880 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a561e40-cae5-4c35-9f8b-9424f4aa61a5" path="/var/lib/kubelet/pods/9a561e40-cae5-4c35-9f8b-9424f4aa61a5/volumes" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.460008 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4" event={"ID":"9c677d7a-2716-4c8d-8d87-7c158ca5de6c","Type":"ContainerDied","Data":"e24a2b7196159a8f823ea2b651e31c8d26bc138fadb0210c69beb18215f0faaf"} Sep 29 10:14:26 crc 
kubenswrapper[4891]: I0929 10:14:26.460121 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e24a2b7196159a8f823ea2b651e31c8d26bc138fadb0210c69beb18215f0faaf" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.460225 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.543565 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr"] Sep 29 10:14:26 crc kubenswrapper[4891]: E0929 10:14:26.544049 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="412b64da-3862-4b4d-9afd-2cde2e423d74" containerName="registry-server" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.544068 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="412b64da-3862-4b4d-9afd-2cde2e423d74" containerName="registry-server" Sep 29 10:14:26 crc kubenswrapper[4891]: E0929 10:14:26.544107 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="412b64da-3862-4b4d-9afd-2cde2e423d74" containerName="extract-content" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.544115 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="412b64da-3862-4b4d-9afd-2cde2e423d74" containerName="extract-content" Sep 29 10:14:26 crc kubenswrapper[4891]: E0929 10:14:26.544136 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c677d7a-2716-4c8d-8d87-7c158ca5de6c" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.544146 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c677d7a-2716-4c8d-8d87-7c158ca5de6c" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 29 10:14:26 crc kubenswrapper[4891]: E0929 10:14:26.544165 4891 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="412b64da-3862-4b4d-9afd-2cde2e423d74" containerName="extract-utilities" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.544172 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="412b64da-3862-4b4d-9afd-2cde2e423d74" containerName="extract-utilities" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.544397 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c677d7a-2716-4c8d-8d87-7c158ca5de6c" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.544428 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="412b64da-3862-4b4d-9afd-2cde2e423d74" containerName="registry-server" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.545206 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.548996 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.549025 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.549388 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.549557 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b9rgd" Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.568512 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr"] Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.610893 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd3f0561-9568-4116-b84e-1209c964e50f-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr\" (UID: \"bd3f0561-9568-4116-b84e-1209c964e50f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr"
Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.611064 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd3f0561-9568-4116-b84e-1209c964e50f-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr\" (UID: \"bd3f0561-9568-4116-b84e-1209c964e50f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr"
Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.611264 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktqsh\" (UniqueName: \"kubernetes.io/projected/bd3f0561-9568-4116-b84e-1209c964e50f-kube-api-access-ktqsh\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr\" (UID: \"bd3f0561-9568-4116-b84e-1209c964e50f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr"
Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.713182 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd3f0561-9568-4116-b84e-1209c964e50f-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr\" (UID: \"bd3f0561-9568-4116-b84e-1209c964e50f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr"
Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.713510 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd3f0561-9568-4116-b84e-1209c964e50f-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr\" (UID: \"bd3f0561-9568-4116-b84e-1209c964e50f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr"
Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.713867 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktqsh\" (UniqueName: \"kubernetes.io/projected/bd3f0561-9568-4116-b84e-1209c964e50f-kube-api-access-ktqsh\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr\" (UID: \"bd3f0561-9568-4116-b84e-1209c964e50f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr"
Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.718534 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd3f0561-9568-4116-b84e-1209c964e50f-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr\" (UID: \"bd3f0561-9568-4116-b84e-1209c964e50f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr"
Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.719016 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd3f0561-9568-4116-b84e-1209c964e50f-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr\" (UID: \"bd3f0561-9568-4116-b84e-1209c964e50f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr"
Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.746825 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktqsh\" (UniqueName: \"kubernetes.io/projected/bd3f0561-9568-4116-b84e-1209c964e50f-kube-api-access-ktqsh\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr\" (UID: \"bd3f0561-9568-4116-b84e-1209c964e50f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr"
Sep 29 10:14:26 crc kubenswrapper[4891]: I0929 10:14:26.877420 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr"
Sep 29 10:14:27 crc kubenswrapper[4891]: W0929 10:14:27.446090 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd3f0561_9568_4116_b84e_1209c964e50f.slice/crio-0223e1797efd51ad3f708b92b0be5ed6bbf56732742e9ef25f1d09196f9b1724 WatchSource:0}: Error finding container 0223e1797efd51ad3f708b92b0be5ed6bbf56732742e9ef25f1d09196f9b1724: Status 404 returned error can't find the container with id 0223e1797efd51ad3f708b92b0be5ed6bbf56732742e9ef25f1d09196f9b1724
Sep 29 10:14:27 crc kubenswrapper[4891]: I0929 10:14:27.449812 4891 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 29 10:14:27 crc kubenswrapper[4891]: I0929 10:14:27.449897 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr"]
Sep 29 10:14:27 crc kubenswrapper[4891]: I0929 10:14:27.475364 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr" event={"ID":"bd3f0561-9568-4116-b84e-1209c964e50f","Type":"ContainerStarted","Data":"0223e1797efd51ad3f708b92b0be5ed6bbf56732742e9ef25f1d09196f9b1724"}
Sep 29 10:14:28 crc kubenswrapper[4891]: I0929 10:14:28.487227 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr" event={"ID":"bd3f0561-9568-4116-b84e-1209c964e50f","Type":"ContainerStarted","Data":"01d65f912963370527eb8e9af1017da5dabbf0885fe3dde8808ec8fb1f92b462"}
Sep 29 10:14:28 crc kubenswrapper[4891]: I0929 10:14:28.505313 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr" podStartSLOduration=1.9060844110000001 podStartE2EDuration="2.505285156s" podCreationTimestamp="2025-09-29 10:14:26 +0000 UTC" firstStartedPulling="2025-09-29 10:14:27.449571036 +0000 UTC m=+1597.654739357" lastFinishedPulling="2025-09-29 10:14:28.048771781 +0000 UTC m=+1598.253940102" observedRunningTime="2025-09-29 10:14:28.503302039 +0000 UTC m=+1598.708470370" watchObservedRunningTime="2025-09-29 10:14:28.505285156 +0000 UTC m=+1598.710453517"
Sep 29 10:14:36 crc kubenswrapper[4891]: I0929 10:14:36.029116 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6d62-account-create-dnmtw"]
Sep 29 10:14:36 crc kubenswrapper[4891]: I0929 10:14:36.039341 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-83c1-account-create-td9vd"]
Sep 29 10:14:36 crc kubenswrapper[4891]: I0929 10:14:36.047970 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6d62-account-create-dnmtw"]
Sep 29 10:14:36 crc kubenswrapper[4891]: I0929 10:14:36.055011 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-83c1-account-create-td9vd"]
Sep 29 10:14:36 crc kubenswrapper[4891]: I0929 10:14:36.186146 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 29 10:14:36 crc kubenswrapper[4891]: I0929 10:14:36.186209 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 10:14:36 crc kubenswrapper[4891]: I0929 10:14:36.186259 4891 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp"
Sep 29 10:14:36 crc kubenswrapper[4891]: I0929 10:14:36.187006 4891 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5"} pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 29 10:14:36 crc kubenswrapper[4891]: I0929 10:14:36.187065 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" containerID="cri-o://4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5" gracePeriod=600
Sep 29 10:14:36 crc kubenswrapper[4891]: E0929 10:14:36.308874 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac"
Sep 29 10:14:36 crc kubenswrapper[4891]: I0929 10:14:36.406441 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30cc69b2-dc90-486e-8d87-3ff906ee4288" path="/var/lib/kubelet/pods/30cc69b2-dc90-486e-8d87-3ff906ee4288/volumes"
Sep 29 10:14:36 crc kubenswrapper[4891]: I0929 10:14:36.406969 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ac4d0d2-6597-4879-818c-7d0748094e3e" path="/var/lib/kubelet/pods/9ac4d0d2-6597-4879-818c-7d0748094e3e/volumes"
Sep 29 10:14:36 crc kubenswrapper[4891]: I0929 10:14:36.580371 4891 generic.go:334] "Generic (PLEG): container finished" podID="582de198-5a15-4c4c-aaea-881c638a42ac" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5" exitCode=0
Sep 29 10:14:36 crc kubenswrapper[4891]: I0929 10:14:36.580423 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerDied","Data":"4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5"}
Sep 29 10:14:36 crc kubenswrapper[4891]: I0929 10:14:36.580467 4891 scope.go:117] "RemoveContainer" containerID="d8a4450f0cf100e3bb069af7451326e6487159492cf29cc8115c4ea917d31960"
Sep 29 10:14:36 crc kubenswrapper[4891]: I0929 10:14:36.581152 4891 scope.go:117] "RemoveContainer" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5"
Sep 29 10:14:36 crc kubenswrapper[4891]: E0929 10:14:36.581392 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac"
Sep 29 10:14:37 crc kubenswrapper[4891]: I0929 10:14:37.034674 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-c87a-account-create-zvx6w"]
Sep 29 10:14:37 crc kubenswrapper[4891]: I0929 10:14:37.046301 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-c87a-account-create-zvx6w"]
Sep 29 10:14:38 crc kubenswrapper[4891]: I0929 10:14:38.419179 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84f29b0d-c5e6-48ea-b1b0-01649ee7ec80" path="/var/lib/kubelet/pods/84f29b0d-c5e6-48ea-b1b0-01649ee7ec80/volumes"
Sep 29 10:14:47 crc kubenswrapper[4891]: I0929 10:14:47.396231 4891 scope.go:117] "RemoveContainer" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5"
Sep 29 10:14:47 crc kubenswrapper[4891]: E0929 10:14:47.397183 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac"
Sep 29 10:14:59 crc kubenswrapper[4891]: I0929 10:14:59.062730 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-cp9f2"]
Sep 29 10:14:59 crc kubenswrapper[4891]: I0929 10:14:59.075641 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-cp9f2"]
Sep 29 10:15:00 crc kubenswrapper[4891]: I0929 10:15:00.171005 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319015-9bq8k"]
Sep 29 10:15:00 crc kubenswrapper[4891]: I0929 10:15:00.173807 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-9bq8k"
Sep 29 10:15:00 crc kubenswrapper[4891]: I0929 10:15:00.176540 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Sep 29 10:15:00 crc kubenswrapper[4891]: I0929 10:15:00.176996 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Sep 29 10:15:00 crc kubenswrapper[4891]: I0929 10:15:00.186709 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319015-9bq8k"]
Sep 29 10:15:00 crc kubenswrapper[4891]: I0929 10:15:00.338458 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnhkh\" (UniqueName: \"kubernetes.io/projected/8613c691-341f-438e-89c9-2ac670aff380-kube-api-access-xnhkh\") pod \"collect-profiles-29319015-9bq8k\" (UID: \"8613c691-341f-438e-89c9-2ac670aff380\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-9bq8k"
Sep 29 10:15:00 crc kubenswrapper[4891]: I0929 10:15:00.338569 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8613c691-341f-438e-89c9-2ac670aff380-secret-volume\") pod \"collect-profiles-29319015-9bq8k\" (UID: \"8613c691-341f-438e-89c9-2ac670aff380\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-9bq8k"
Sep 29 10:15:00 crc kubenswrapper[4891]: I0929 10:15:00.338681 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8613c691-341f-438e-89c9-2ac670aff380-config-volume\") pod \"collect-profiles-29319015-9bq8k\" (UID: \"8613c691-341f-438e-89c9-2ac670aff380\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-9bq8k"
Sep 29 10:15:00 crc kubenswrapper[4891]: I0929 10:15:00.409845 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11c99c95-cf39-4352-b4c0-25c0ac4b6465" path="/var/lib/kubelet/pods/11c99c95-cf39-4352-b4c0-25c0ac4b6465/volumes"
Sep 29 10:15:00 crc kubenswrapper[4891]: I0929 10:15:00.441719 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8613c691-341f-438e-89c9-2ac670aff380-secret-volume\") pod \"collect-profiles-29319015-9bq8k\" (UID: \"8613c691-341f-438e-89c9-2ac670aff380\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-9bq8k"
Sep 29 10:15:00 crc kubenswrapper[4891]: I0929 10:15:00.442339 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8613c691-341f-438e-89c9-2ac670aff380-config-volume\") pod \"collect-profiles-29319015-9bq8k\" (UID: \"8613c691-341f-438e-89c9-2ac670aff380\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-9bq8k"
Sep 29 10:15:00 crc kubenswrapper[4891]: I0929 10:15:00.442603 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnhkh\" (UniqueName: \"kubernetes.io/projected/8613c691-341f-438e-89c9-2ac670aff380-kube-api-access-xnhkh\") pod \"collect-profiles-29319015-9bq8k\" (UID: \"8613c691-341f-438e-89c9-2ac670aff380\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-9bq8k"
Sep 29 10:15:00 crc kubenswrapper[4891]: I0929 10:15:00.443622 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8613c691-341f-438e-89c9-2ac670aff380-config-volume\") pod \"collect-profiles-29319015-9bq8k\" (UID: \"8613c691-341f-438e-89c9-2ac670aff380\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-9bq8k"
Sep 29 10:15:00 crc kubenswrapper[4891]: I0929 10:15:00.449987 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8613c691-341f-438e-89c9-2ac670aff380-secret-volume\") pod \"collect-profiles-29319015-9bq8k\" (UID: \"8613c691-341f-438e-89c9-2ac670aff380\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-9bq8k"
Sep 29 10:15:00 crc kubenswrapper[4891]: I0929 10:15:00.466588 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnhkh\" (UniqueName: \"kubernetes.io/projected/8613c691-341f-438e-89c9-2ac670aff380-kube-api-access-xnhkh\") pod \"collect-profiles-29319015-9bq8k\" (UID: \"8613c691-341f-438e-89c9-2ac670aff380\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-9bq8k"
Sep 29 10:15:00 crc kubenswrapper[4891]: I0929 10:15:00.511705 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-9bq8k"
Sep 29 10:15:00 crc kubenswrapper[4891]: I0929 10:15:00.986846 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319015-9bq8k"]
Sep 29 10:15:01 crc kubenswrapper[4891]: I0929 10:15:01.396663 4891 scope.go:117] "RemoveContainer" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5"
Sep 29 10:15:01 crc kubenswrapper[4891]: E0929 10:15:01.397017 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac"
Sep 29 10:15:01 crc kubenswrapper[4891]: I0929 10:15:01.922769 4891 generic.go:334] "Generic (PLEG): container finished" podID="8613c691-341f-438e-89c9-2ac670aff380" containerID="df696df1ce7ad43e752d8216bd674b6b1f7a0ac6c96f4f112dbcbf4738cf6625" exitCode=0
Sep 29 10:15:01 crc kubenswrapper[4891]: I0929 10:15:01.922823 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-9bq8k" event={"ID":"8613c691-341f-438e-89c9-2ac670aff380","Type":"ContainerDied","Data":"df696df1ce7ad43e752d8216bd674b6b1f7a0ac6c96f4f112dbcbf4738cf6625"}
Sep 29 10:15:01 crc kubenswrapper[4891]: I0929 10:15:01.922861 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-9bq8k" event={"ID":"8613c691-341f-438e-89c9-2ac670aff380","Type":"ContainerStarted","Data":"e3f77468eb9698bccd6bdae409423fe20599000f9b5a14fe3925768c41c47163"}
Sep 29 10:15:03 crc kubenswrapper[4891]: I0929 10:15:03.321653 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-9bq8k"
Sep 29 10:15:03 crc kubenswrapper[4891]: I0929 10:15:03.410191 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnhkh\" (UniqueName: \"kubernetes.io/projected/8613c691-341f-438e-89c9-2ac670aff380-kube-api-access-xnhkh\") pod \"8613c691-341f-438e-89c9-2ac670aff380\" (UID: \"8613c691-341f-438e-89c9-2ac670aff380\") "
Sep 29 10:15:03 crc kubenswrapper[4891]: I0929 10:15:03.410389 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8613c691-341f-438e-89c9-2ac670aff380-secret-volume\") pod \"8613c691-341f-438e-89c9-2ac670aff380\" (UID: \"8613c691-341f-438e-89c9-2ac670aff380\") "
Sep 29 10:15:03 crc kubenswrapper[4891]: I0929 10:15:03.410453 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8613c691-341f-438e-89c9-2ac670aff380-config-volume\") pod \"8613c691-341f-438e-89c9-2ac670aff380\" (UID: \"8613c691-341f-438e-89c9-2ac670aff380\") "
Sep 29 10:15:03 crc kubenswrapper[4891]: I0929 10:15:03.411579 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8613c691-341f-438e-89c9-2ac670aff380-config-volume" (OuterVolumeSpecName: "config-volume") pod "8613c691-341f-438e-89c9-2ac670aff380" (UID: "8613c691-341f-438e-89c9-2ac670aff380"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 10:15:03 crc kubenswrapper[4891]: I0929 10:15:03.418389 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8613c691-341f-438e-89c9-2ac670aff380-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8613c691-341f-438e-89c9-2ac670aff380" (UID: "8613c691-341f-438e-89c9-2ac670aff380"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:15:03 crc kubenswrapper[4891]: I0929 10:15:03.422225 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8613c691-341f-438e-89c9-2ac670aff380-kube-api-access-xnhkh" (OuterVolumeSpecName: "kube-api-access-xnhkh") pod "8613c691-341f-438e-89c9-2ac670aff380" (UID: "8613c691-341f-438e-89c9-2ac670aff380"). InnerVolumeSpecName "kube-api-access-xnhkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:15:03 crc kubenswrapper[4891]: I0929 10:15:03.513410 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnhkh\" (UniqueName: \"kubernetes.io/projected/8613c691-341f-438e-89c9-2ac670aff380-kube-api-access-xnhkh\") on node \"crc\" DevicePath \"\""
Sep 29 10:15:03 crc kubenswrapper[4891]: I0929 10:15:03.513480 4891 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8613c691-341f-438e-89c9-2ac670aff380-secret-volume\") on node \"crc\" DevicePath \"\""
Sep 29 10:15:03 crc kubenswrapper[4891]: I0929 10:15:03.513503 4891 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8613c691-341f-438e-89c9-2ac670aff380-config-volume\") on node \"crc\" DevicePath \"\""
Sep 29 10:15:03 crc kubenswrapper[4891]: I0929 10:15:03.941768 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-9bq8k" event={"ID":"8613c691-341f-438e-89c9-2ac670aff380","Type":"ContainerDied","Data":"e3f77468eb9698bccd6bdae409423fe20599000f9b5a14fe3925768c41c47163"}
Sep 29 10:15:03 crc kubenswrapper[4891]: I0929 10:15:03.941828 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3f77468eb9698bccd6bdae409423fe20599000f9b5a14fe3925768c41c47163"
Sep 29 10:15:03 crc kubenswrapper[4891]: I0929 10:15:03.941884 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-9bq8k"
Sep 29 10:15:04 crc kubenswrapper[4891]: I0929 10:15:04.050377 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-9dsgv"]
Sep 29 10:15:04 crc kubenswrapper[4891]: I0929 10:15:04.063285 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-2bp6p"]
Sep 29 10:15:04 crc kubenswrapper[4891]: I0929 10:15:04.076575 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-58p75"]
Sep 29 10:15:04 crc kubenswrapper[4891]: I0929 10:15:04.086151 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-2bp6p"]
Sep 29 10:15:04 crc kubenswrapper[4891]: I0929 10:15:04.099243 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-9dsgv"]
Sep 29 10:15:04 crc kubenswrapper[4891]: I0929 10:15:04.110834 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-58p75"]
Sep 29 10:15:04 crc kubenswrapper[4891]: I0929 10:15:04.411434 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="470b84f7-4a21-43c5-8770-f252f3e9bf6c" path="/var/lib/kubelet/pods/470b84f7-4a21-43c5-8770-f252f3e9bf6c/volumes"
Sep 29 10:15:04 crc kubenswrapper[4891]: I0929 10:15:04.412140 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97300203-1ba7-435f-9567-db7ebe1f6234" path="/var/lib/kubelet/pods/97300203-1ba7-435f-9567-db7ebe1f6234/volumes"
Sep 29 10:15:04 crc kubenswrapper[4891]: I0929 10:15:04.412768 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b62e005-e6a8-4385-9795-54b88491fab1" path="/var/lib/kubelet/pods/9b62e005-e6a8-4385-9795-54b88491fab1/volumes"
Sep 29 10:15:12 crc kubenswrapper[4891]: I0929 10:15:12.397061 4891 scope.go:117] "RemoveContainer" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5"
Sep 29 10:15:12 crc kubenswrapper[4891]: E0929 10:15:12.397894 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac"
Sep 29 10:15:13 crc kubenswrapper[4891]: I0929 10:15:13.038407 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-8a72-account-create-wqxzw"]
Sep 29 10:15:13 crc kubenswrapper[4891]: I0929 10:15:13.057088 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ccba-account-create-cdq25"]
Sep 29 10:15:13 crc kubenswrapper[4891]: I0929 10:15:13.065941 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-8a72-account-create-wqxzw"]
Sep 29 10:15:13 crc kubenswrapper[4891]: I0929 10:15:13.073673 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-ccba-account-create-cdq25"]
Sep 29 10:15:14 crc kubenswrapper[4891]: I0929 10:15:14.035757 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-2bbjd"]
Sep 29 10:15:14 crc kubenswrapper[4891]: I0929 10:15:14.047083 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-2bbjd"]
Sep 29 10:15:14 crc kubenswrapper[4891]: I0929 10:15:14.418286 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e83cc3b-a436-4846-9246-1e1dec8e85cc" path="/var/lib/kubelet/pods/7e83cc3b-a436-4846-9246-1e1dec8e85cc/volumes"
Sep 29 10:15:14 crc kubenswrapper[4891]: I0929 10:15:14.420155 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f582bd0-7e30-4fef-8623-9d0482c4aa7b" path="/var/lib/kubelet/pods/7f582bd0-7e30-4fef-8623-9d0482c4aa7b/volumes"
Sep 29 10:15:14 crc kubenswrapper[4891]: I0929 10:15:14.421416 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f670344-0a7d-4c50-ad4d-e00195b3f232" path="/var/lib/kubelet/pods/9f670344-0a7d-4c50-ad4d-e00195b3f232/volumes"
Sep 29 10:15:14 crc kubenswrapper[4891]: I0929 10:15:14.761376 4891 scope.go:117] "RemoveContainer" containerID="51ee6c62107e38e82f6f0d3462fa457544eb4d0e29f27540795bc5736c8208f8"
Sep 29 10:15:14 crc kubenswrapper[4891]: I0929 10:15:14.795661 4891 scope.go:117] "RemoveContainer" containerID="74432fd4470f11739ef6600a7c239f5ea978893cc33261af3c2634446e285ed5"
Sep 29 10:15:14 crc kubenswrapper[4891]: I0929 10:15:14.847339 4891 scope.go:117] "RemoveContainer" containerID="4442c7ee92e3b29486eece5f6bf2f6429ca7849604a4f6e5195602ae26b0fd56"
Sep 29 10:15:14 crc kubenswrapper[4891]: I0929 10:15:14.897420 4891 scope.go:117] "RemoveContainer" containerID="0637e2fb45d8c9d1f0ccc2cb7817d4bd647859ebe52a35b0cbc9eee3a3593351"
Sep 29 10:15:14 crc kubenswrapper[4891]: I0929 10:15:14.977528 4891 scope.go:117] "RemoveContainer" containerID="497efa1909363164e1611b21306d4443d79bd885cb159a68bb97c942238be1dd"
Sep 29 10:15:14 crc kubenswrapper[4891]: I0929 10:15:14.998516 4891 scope.go:117] "RemoveContainer" containerID="9ab1bf7e06c3a6c07df39378e627acc0e03a039a5b5d414ca236df7617c23ffe"
Sep 29 10:15:15 crc kubenswrapper[4891]: I0929 10:15:15.048375 4891 scope.go:117] "RemoveContainer" containerID="8b51737e23f5d950b90347cf88885c6bf6d7bb7ecbf315ca425c32e06843c4dd"
Sep 29 10:15:15 crc kubenswrapper[4891]: I0929 10:15:15.075234 4891 scope.go:117] "RemoveContainer" containerID="090bddd3e5abebc806e3514e75503344a5b073460904eb800faf5ff00037864f"
Sep 29 10:15:15 crc kubenswrapper[4891]: I0929 10:15:15.103160 4891 scope.go:117] "RemoveContainer" containerID="77b59f7de13b33b0804cfbec8ef4aab4efbb2e8a838fa795c94826f8fc46bc5e"
Sep 29 10:15:15 crc kubenswrapper[4891]: I0929 10:15:15.123220 4891 scope.go:117] "RemoveContainer" containerID="bfb9fcc0ae82a908049fcb06cc95c98d70cc4cf7fcdaf9366bb6d51bdc60cde4"
Sep 29 10:15:15 crc kubenswrapper[4891]: I0929 10:15:15.143541 4891 scope.go:117] "RemoveContainer" containerID="abf731f434a664062034377e3969182518ca28e110d4516277c631c1140b6297"
Sep 29 10:15:15 crc kubenswrapper[4891]: I0929 10:15:15.183778 4891 scope.go:117] "RemoveContainer" containerID="500e7d112bcc7de537acd25ecbcaeb58ef79020b66a6f7e14c54bddd259219a3"
Sep 29 10:15:15 crc kubenswrapper[4891]: I0929 10:15:15.207068 4891 scope.go:117] "RemoveContainer" containerID="c81f78241212772aa95f0f2d22a048f7b7f9bd391b893af6e6ff9b7f7c2cb6fa"
Sep 29 10:15:26 crc kubenswrapper[4891]: I0929 10:15:26.396947 4891 scope.go:117] "RemoveContainer" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5"
Sep 29 10:15:26 crc kubenswrapper[4891]: E0929 10:15:26.398019 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac"
Sep 29 10:15:35 crc kubenswrapper[4891]: I0929 10:15:35.041134 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8536-account-create-sgsw4"]
Sep 29 10:15:35 crc kubenswrapper[4891]: I0929 10:15:35.061666 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8536-account-create-sgsw4"]
Sep 29 10:15:36 crc kubenswrapper[4891]: I0929 10:15:36.413825 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa4026f9-ee20-44aa-9575-b1e64680139a" path="/var/lib/kubelet/pods/fa4026f9-ee20-44aa-9575-b1e64680139a/volumes"
Sep 29 10:15:39 crc kubenswrapper[4891]: I0929 10:15:39.396914 4891 scope.go:117] "RemoveContainer" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5"
Sep 29 10:15:39 crc kubenswrapper[4891]: E0929 10:15:39.397584 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac"
Sep 29 10:15:48 crc kubenswrapper[4891]: I0929 10:15:48.060545 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-g57l7"]
Sep 29 10:15:48 crc kubenswrapper[4891]: I0929 10:15:48.075976 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-g57l7"]
Sep 29 10:15:48 crc kubenswrapper[4891]: I0929 10:15:48.417895 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2f25328-bcaa-4a33-b55b-f7a026e29087" path="/var/lib/kubelet/pods/c2f25328-bcaa-4a33-b55b-f7a026e29087/volumes"
Sep 29 10:15:49 crc kubenswrapper[4891]: I0929 10:15:49.036924 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-bm7bg"]
Sep 29 10:15:49 crc kubenswrapper[4891]: I0929 10:15:49.052929 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-bm7bg"]
Sep 29 10:15:50 crc kubenswrapper[4891]: I0929 10:15:50.417894 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e641569b-322f-4157-aaf2-44d5f700234d" path="/var/lib/kubelet/pods/e641569b-322f-4157-aaf2-44d5f700234d/volumes"
Sep 29 10:15:53 crc kubenswrapper[4891]: I0929 10:15:53.396341 4891 scope.go:117] "RemoveContainer" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5"
Sep 29 10:15:53 crc kubenswrapper[4891]: E0929 10:15:53.397089 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac"
Sep 29 10:16:04 crc kubenswrapper[4891]: I0929 10:16:04.396074 4891 scope.go:117] "RemoveContainer" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5"
Sep 29 10:16:04 crc kubenswrapper[4891]: E0929 10:16:04.396728 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac"
Sep 29 10:16:07 crc kubenswrapper[4891]: I0929 10:16:07.046021 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-s4bk8"]
Sep 29 10:16:07 crc kubenswrapper[4891]: I0929 10:16:07.058230 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-s4bk8"]
Sep 29 10:16:08 crc kubenswrapper[4891]: I0929 10:16:08.411920 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5872fd60-b8c9-4f00-8c9a-679960a32e27" path="/var/lib/kubelet/pods/5872fd60-b8c9-4f00-8c9a-679960a32e27/volumes"
Sep 29 10:16:10 crc kubenswrapper[4891]: I0929 10:16:10.702072 4891 generic.go:334] "Generic (PLEG): container finished" podID="bd3f0561-9568-4116-b84e-1209c964e50f" containerID="01d65f912963370527eb8e9af1017da5dabbf0885fe3dde8808ec8fb1f92b462" exitCode=0
Sep 29 10:16:10 crc kubenswrapper[4891]: I0929 10:16:10.702127 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr" event={"ID":"bd3f0561-9568-4116-b84e-1209c964e50f","Type":"ContainerDied","Data":"01d65f912963370527eb8e9af1017da5dabbf0885fe3dde8808ec8fb1f92b462"}
Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.034930 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-jv6jw"]
Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.041293 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-jv6jw"]
Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.122810 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr"
Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.209844 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd3f0561-9568-4116-b84e-1209c964e50f-inventory\") pod \"bd3f0561-9568-4116-b84e-1209c964e50f\" (UID: \"bd3f0561-9568-4116-b84e-1209c964e50f\") "
Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.209977 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd3f0561-9568-4116-b84e-1209c964e50f-ssh-key\") pod \"bd3f0561-9568-4116-b84e-1209c964e50f\" (UID: \"bd3f0561-9568-4116-b84e-1209c964e50f\") "
Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.210182 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktqsh\" (UniqueName: \"kubernetes.io/projected/bd3f0561-9568-4116-b84e-1209c964e50f-kube-api-access-ktqsh\") pod \"bd3f0561-9568-4116-b84e-1209c964e50f\" (UID: \"bd3f0561-9568-4116-b84e-1209c964e50f\") "
Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.221906 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd3f0561-9568-4116-b84e-1209c964e50f-kube-api-access-ktqsh" (OuterVolumeSpecName: "kube-api-access-ktqsh") pod "bd3f0561-9568-4116-b84e-1209c964e50f" (UID: "bd3f0561-9568-4116-b84e-1209c964e50f"). InnerVolumeSpecName "kube-api-access-ktqsh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.256358 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd3f0561-9568-4116-b84e-1209c964e50f-inventory" (OuterVolumeSpecName: "inventory") pod "bd3f0561-9568-4116-b84e-1209c964e50f" (UID: "bd3f0561-9568-4116-b84e-1209c964e50f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.256396 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd3f0561-9568-4116-b84e-1209c964e50f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bd3f0561-9568-4116-b84e-1209c964e50f" (UID: "bd3f0561-9568-4116-b84e-1209c964e50f"). InnerVolumeSpecName "ssh-key".
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.315020 4891 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd3f0561-9568-4116-b84e-1209c964e50f-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.315064 4891 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd3f0561-9568-4116-b84e-1209c964e50f-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.315077 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktqsh\" (UniqueName: \"kubernetes.io/projected/bd3f0561-9568-4116-b84e-1209c964e50f-kube-api-access-ktqsh\") on node \"crc\" DevicePath \"\"" Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.412588 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb88c2dd-0bb3-4425-842f-b697d51f8273" path="/var/lib/kubelet/pods/fb88c2dd-0bb3-4425-842f-b697d51f8273/volumes" Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.728107 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr" event={"ID":"bd3f0561-9568-4116-b84e-1209c964e50f","Type":"ContainerDied","Data":"0223e1797efd51ad3f708b92b0be5ed6bbf56732742e9ef25f1d09196f9b1724"} Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.728161 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0223e1797efd51ad3f708b92b0be5ed6bbf56732742e9ef25f1d09196f9b1724" Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.728205 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr" Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.829216 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t"] Sep 29 10:16:12 crc kubenswrapper[4891]: E0929 10:16:12.830045 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd3f0561-9568-4116-b84e-1209c964e50f" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.830168 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd3f0561-9568-4116-b84e-1209c964e50f" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Sep 29 10:16:12 crc kubenswrapper[4891]: E0929 10:16:12.830270 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8613c691-341f-438e-89c9-2ac670aff380" containerName="collect-profiles" Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.830343 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="8613c691-341f-438e-89c9-2ac670aff380" containerName="collect-profiles" Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.830714 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd3f0561-9568-4116-b84e-1209c964e50f" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.830908 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="8613c691-341f-438e-89c9-2ac670aff380" containerName="collect-profiles" Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.831946 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t" Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.838202 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.839282 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.840433 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b9rgd" Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.846281 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t"] Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.847442 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.927416 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/677e5a8c-37d1-41a1-bd47-2ef7af3a3570-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t\" (UID: \"677e5a8c-37d1-41a1-bd47-2ef7af3a3570\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t" Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 10:16:12.927504 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjlwq\" (UniqueName: \"kubernetes.io/projected/677e5a8c-37d1-41a1-bd47-2ef7af3a3570-kube-api-access-pjlwq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t\" (UID: \"677e5a8c-37d1-41a1-bd47-2ef7af3a3570\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t" Sep 29 10:16:12 crc kubenswrapper[4891]: I0929 
10:16:12.927629 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/677e5a8c-37d1-41a1-bd47-2ef7af3a3570-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t\" (UID: \"677e5a8c-37d1-41a1-bd47-2ef7af3a3570\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t" Sep 29 10:16:13 crc kubenswrapper[4891]: I0929 10:16:13.029559 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/677e5a8c-37d1-41a1-bd47-2ef7af3a3570-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t\" (UID: \"677e5a8c-37d1-41a1-bd47-2ef7af3a3570\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t" Sep 29 10:16:13 crc kubenswrapper[4891]: I0929 10:16:13.029841 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/677e5a8c-37d1-41a1-bd47-2ef7af3a3570-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t\" (UID: \"677e5a8c-37d1-41a1-bd47-2ef7af3a3570\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t" Sep 29 10:16:13 crc kubenswrapper[4891]: I0929 10:16:13.029954 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjlwq\" (UniqueName: \"kubernetes.io/projected/677e5a8c-37d1-41a1-bd47-2ef7af3a3570-kube-api-access-pjlwq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t\" (UID: \"677e5a8c-37d1-41a1-bd47-2ef7af3a3570\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t" Sep 29 10:16:13 crc kubenswrapper[4891]: I0929 10:16:13.032988 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/677e5a8c-37d1-41a1-bd47-2ef7af3a3570-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t\" (UID: \"677e5a8c-37d1-41a1-bd47-2ef7af3a3570\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t" Sep 29 10:16:13 crc kubenswrapper[4891]: I0929 10:16:13.033444 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/677e5a8c-37d1-41a1-bd47-2ef7af3a3570-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t\" (UID: \"677e5a8c-37d1-41a1-bd47-2ef7af3a3570\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t" Sep 29 10:16:13 crc kubenswrapper[4891]: I0929 10:16:13.055082 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjlwq\" (UniqueName: \"kubernetes.io/projected/677e5a8c-37d1-41a1-bd47-2ef7af3a3570-kube-api-access-pjlwq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t\" (UID: \"677e5a8c-37d1-41a1-bd47-2ef7af3a3570\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t" Sep 29 10:16:13 crc kubenswrapper[4891]: I0929 10:16:13.197381 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t" Sep 29 10:16:13 crc kubenswrapper[4891]: I0929 10:16:13.754672 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t"] Sep 29 10:16:14 crc kubenswrapper[4891]: I0929 10:16:14.746908 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t" event={"ID":"677e5a8c-37d1-41a1-bd47-2ef7af3a3570","Type":"ContainerStarted","Data":"5f2889bb019848517f851a10bbd39816cade8c249e49081471f192119f785a72"} Sep 29 10:16:14 crc kubenswrapper[4891]: I0929 10:16:14.747254 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t" event={"ID":"677e5a8c-37d1-41a1-bd47-2ef7af3a3570","Type":"ContainerStarted","Data":"b636498a689029ec6650ae3f647daa5fed7211a9e60505d2955ddb37ef4c925b"} Sep 29 10:16:14 crc kubenswrapper[4891]: I0929 10:16:14.772711 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t" podStartSLOduration=2.247092647 podStartE2EDuration="2.772669362s" podCreationTimestamp="2025-09-29 10:16:12 +0000 UTC" firstStartedPulling="2025-09-29 10:16:13.766100287 +0000 UTC m=+1703.971268628" lastFinishedPulling="2025-09-29 10:16:14.291676982 +0000 UTC m=+1704.496845343" observedRunningTime="2025-09-29 10:16:14.766383459 +0000 UTC m=+1704.971551800" watchObservedRunningTime="2025-09-29 10:16:14.772669362 +0000 UTC m=+1704.977837683" Sep 29 10:16:15 crc kubenswrapper[4891]: I0929 10:16:15.397245 4891 scope.go:117] "RemoveContainer" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5" Sep 29 10:16:15 crc kubenswrapper[4891]: E0929 10:16:15.398329 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:16:15 crc kubenswrapper[4891]: I0929 10:16:15.481810 4891 scope.go:117] "RemoveContainer" containerID="037c02798e133e92a31e44733436f13ca27a498b8569fab2b48c805aa0594c30" Sep 29 10:16:15 crc kubenswrapper[4891]: I0929 10:16:15.508770 4891 scope.go:117] "RemoveContainer" containerID="05976b9244536aaf671ae008bfa47e8488858cf9bd923da3799ab7987da88fc2" Sep 29 10:16:15 crc kubenswrapper[4891]: I0929 10:16:15.554688 4891 scope.go:117] "RemoveContainer" containerID="5ea6368b877c7f0c81501f74137a9cefa3020a0725510ca09dfdf7ff6ab644da" Sep 29 10:16:15 crc kubenswrapper[4891]: I0929 10:16:15.624145 4891 scope.go:117] "RemoveContainer" containerID="39b5379b9aec7726d5eed8e1576b89438e7edade0f4fd10f0a0b4471d6fd8b6b" Sep 29 10:16:15 crc kubenswrapper[4891]: I0929 10:16:15.684093 4891 scope.go:117] "RemoveContainer" containerID="d53bf5890f49da04046f8b0e6e3e76b26c13f583a8914579e66d0cb000c17d7a" Sep 29 10:16:26 crc kubenswrapper[4891]: I0929 10:16:26.396841 4891 scope.go:117] "RemoveContainer" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5" Sep 29 10:16:26 crc kubenswrapper[4891]: E0929 10:16:26.397612 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:16:37 crc kubenswrapper[4891]: I0929 10:16:37.051412 4891 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/cinder-db-sync-hgm8s"] Sep 29 10:16:37 crc kubenswrapper[4891]: I0929 10:16:37.059091 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-hgm8s"] Sep 29 10:16:38 crc kubenswrapper[4891]: I0929 10:16:38.408154 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7dd6438-e338-4dce-b2be-0e36b359631c" path="/var/lib/kubelet/pods/f7dd6438-e338-4dce-b2be-0e36b359631c/volumes" Sep 29 10:16:40 crc kubenswrapper[4891]: I0929 10:16:40.050874 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-bbtmc"] Sep 29 10:16:40 crc kubenswrapper[4891]: I0929 10:16:40.059747 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-p9kt4"] Sep 29 10:16:40 crc kubenswrapper[4891]: I0929 10:16:40.070815 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-kzk72"] Sep 29 10:16:40 crc kubenswrapper[4891]: I0929 10:16:40.080323 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-p9kt4"] Sep 29 10:16:40 crc kubenswrapper[4891]: I0929 10:16:40.090383 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-bbtmc"] Sep 29 10:16:40 crc kubenswrapper[4891]: I0929 10:16:40.099198 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-kzk72"] Sep 29 10:16:40 crc kubenswrapper[4891]: I0929 10:16:40.481448 4891 scope.go:117] "RemoveContainer" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5" Sep 29 10:16:40 crc kubenswrapper[4891]: E0929 10:16:40.482009 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:16:40 crc kubenswrapper[4891]: I0929 10:16:40.482682 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0718b8fe-fc19-46ae-8c7f-bdd9908c9730" path="/var/lib/kubelet/pods/0718b8fe-fc19-46ae-8c7f-bdd9908c9730/volumes" Sep 29 10:16:40 crc kubenswrapper[4891]: I0929 10:16:40.484987 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11722413-5550-4d6a-b328-f57f26166791" path="/var/lib/kubelet/pods/11722413-5550-4d6a-b328-f57f26166791/volumes" Sep 29 10:16:40 crc kubenswrapper[4891]: I0929 10:16:40.490204 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc047cc9-f61d-4575-a8bd-d6b69aa77701" path="/var/lib/kubelet/pods/fc047cc9-f61d-4575-a8bd-d6b69aa77701/volumes" Sep 29 10:16:47 crc kubenswrapper[4891]: I0929 10:16:47.033317 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-86a2-account-create-dtlnc"] Sep 29 10:16:47 crc kubenswrapper[4891]: I0929 10:16:47.042086 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-86a2-account-create-dtlnc"] Sep 29 10:16:48 crc kubenswrapper[4891]: I0929 10:16:48.412760 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f43d7587-9a50-41b6-9e8b-93b2e613c7c3" path="/var/lib/kubelet/pods/f43d7587-9a50-41b6-9e8b-93b2e613c7c3/volumes" Sep 29 10:16:49 crc kubenswrapper[4891]: I0929 10:16:49.034058 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-9552-account-create-l4fsm"] Sep 29 10:16:49 crc kubenswrapper[4891]: I0929 10:16:49.043882 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-b627-account-create-hdmpf"] Sep 29 10:16:49 crc kubenswrapper[4891]: I0929 10:16:49.055003 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-9552-account-create-l4fsm"] Sep 29 10:16:49 crc 
kubenswrapper[4891]: I0929 10:16:49.061015 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-b627-account-create-hdmpf"] Sep 29 10:16:50 crc kubenswrapper[4891]: I0929 10:16:50.410297 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1788415f-6215-4d2e-925b-bacb13797579" path="/var/lib/kubelet/pods/1788415f-6215-4d2e-925b-bacb13797579/volumes" Sep 29 10:16:50 crc kubenswrapper[4891]: I0929 10:16:50.413013 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1ca376b-baea-4fb6-94ce-eb3def80c06e" path="/var/lib/kubelet/pods/a1ca376b-baea-4fb6-94ce-eb3def80c06e/volumes" Sep 29 10:16:55 crc kubenswrapper[4891]: I0929 10:16:55.396333 4891 scope.go:117] "RemoveContainer" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5" Sep 29 10:16:55 crc kubenswrapper[4891]: E0929 10:16:55.397494 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:17:10 crc kubenswrapper[4891]: I0929 10:17:10.396062 4891 scope.go:117] "RemoveContainer" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5" Sep 29 10:17:10 crc kubenswrapper[4891]: E0929 10:17:10.396818 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 
29 10:17:15 crc kubenswrapper[4891]: I0929 10:17:15.860235 4891 scope.go:117] "RemoveContainer" containerID="65271b9cb8a5b2fd2dd5a4ba84fdeb674ef6fe6d3e09305dd5af33edf0ad69cd" Sep 29 10:17:15 crc kubenswrapper[4891]: I0929 10:17:15.890588 4891 scope.go:117] "RemoveContainer" containerID="311ce8af5d677477c40eb7933b86dced495a406b0d7075bdeff78e892d32efd1" Sep 29 10:17:15 crc kubenswrapper[4891]: I0929 10:17:15.950309 4891 scope.go:117] "RemoveContainer" containerID="fc2abc4dd9c52bfb8d12a419e65f4e47289e232baab2282213e00b71966e8fe1" Sep 29 10:17:15 crc kubenswrapper[4891]: I0929 10:17:15.999830 4891 scope.go:117] "RemoveContainer" containerID="31e52820b7dec217715b8c9fb4934b0d20c89e9419a82a26be2c31565a0a1edb" Sep 29 10:17:16 crc kubenswrapper[4891]: I0929 10:17:16.053926 4891 scope.go:117] "RemoveContainer" containerID="c28443f9e7a2b71f9692eaaa2c85a35d227f85bfaefc93933ca38d04d8f8d31a" Sep 29 10:17:16 crc kubenswrapper[4891]: I0929 10:17:16.126608 4891 scope.go:117] "RemoveContainer" containerID="6a57541754cdf69b2f3908fae36525ce3f9ee5bef4fd14b7a5e1c5e9fc0dfbd7" Sep 29 10:17:16 crc kubenswrapper[4891]: I0929 10:17:16.158101 4891 scope.go:117] "RemoveContainer" containerID="3cf5c0f51126bb73d321dd15082b5342d17bee079ca62c8cc243358139c5e88d" Sep 29 10:17:20 crc kubenswrapper[4891]: I0929 10:17:20.041861 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jt2hn"] Sep 29 10:17:20 crc kubenswrapper[4891]: I0929 10:17:20.049325 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jt2hn"] Sep 29 10:17:20 crc kubenswrapper[4891]: I0929 10:17:20.423071 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f8acd86-ee16-42c5-9309-7651699a0886" path="/var/lib/kubelet/pods/2f8acd86-ee16-42c5-9309-7651699a0886/volumes" Sep 29 10:17:23 crc kubenswrapper[4891]: I0929 10:17:23.396941 4891 scope.go:117] "RemoveContainer" 
containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5" Sep 29 10:17:23 crc kubenswrapper[4891]: E0929 10:17:23.399196 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:17:34 crc kubenswrapper[4891]: I0929 10:17:34.396637 4891 scope.go:117] "RemoveContainer" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5" Sep 29 10:17:34 crc kubenswrapper[4891]: E0929 10:17:34.397624 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:17:34 crc kubenswrapper[4891]: I0929 10:17:34.571484 4891 generic.go:334] "Generic (PLEG): container finished" podID="677e5a8c-37d1-41a1-bd47-2ef7af3a3570" containerID="5f2889bb019848517f851a10bbd39816cade8c249e49081471f192119f785a72" exitCode=0 Sep 29 10:17:34 crc kubenswrapper[4891]: I0929 10:17:34.571543 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t" event={"ID":"677e5a8c-37d1-41a1-bd47-2ef7af3a3570","Type":"ContainerDied","Data":"5f2889bb019848517f851a10bbd39816cade8c249e49081471f192119f785a72"} Sep 29 10:17:35 crc kubenswrapper[4891]: I0929 10:17:35.983707 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.091861 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjlwq\" (UniqueName: \"kubernetes.io/projected/677e5a8c-37d1-41a1-bd47-2ef7af3a3570-kube-api-access-pjlwq\") pod \"677e5a8c-37d1-41a1-bd47-2ef7af3a3570\" (UID: \"677e5a8c-37d1-41a1-bd47-2ef7af3a3570\") " Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.092028 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/677e5a8c-37d1-41a1-bd47-2ef7af3a3570-inventory\") pod \"677e5a8c-37d1-41a1-bd47-2ef7af3a3570\" (UID: \"677e5a8c-37d1-41a1-bd47-2ef7af3a3570\") " Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.092067 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/677e5a8c-37d1-41a1-bd47-2ef7af3a3570-ssh-key\") pod \"677e5a8c-37d1-41a1-bd47-2ef7af3a3570\" (UID: \"677e5a8c-37d1-41a1-bd47-2ef7af3a3570\") " Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.097581 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/677e5a8c-37d1-41a1-bd47-2ef7af3a3570-kube-api-access-pjlwq" (OuterVolumeSpecName: "kube-api-access-pjlwq") pod "677e5a8c-37d1-41a1-bd47-2ef7af3a3570" (UID: "677e5a8c-37d1-41a1-bd47-2ef7af3a3570"). InnerVolumeSpecName "kube-api-access-pjlwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.129108 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/677e5a8c-37d1-41a1-bd47-2ef7af3a3570-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "677e5a8c-37d1-41a1-bd47-2ef7af3a3570" (UID: "677e5a8c-37d1-41a1-bd47-2ef7af3a3570"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.151042 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/677e5a8c-37d1-41a1-bd47-2ef7af3a3570-inventory" (OuterVolumeSpecName: "inventory") pod "677e5a8c-37d1-41a1-bd47-2ef7af3a3570" (UID: "677e5a8c-37d1-41a1-bd47-2ef7af3a3570"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.193424 4891 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/677e5a8c-37d1-41a1-bd47-2ef7af3a3570-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.193453 4891 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/677e5a8c-37d1-41a1-bd47-2ef7af3a3570-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.193463 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjlwq\" (UniqueName: \"kubernetes.io/projected/677e5a8c-37d1-41a1-bd47-2ef7af3a3570-kube-api-access-pjlwq\") on node \"crc\" DevicePath \"\"" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.589706 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t" event={"ID":"677e5a8c-37d1-41a1-bd47-2ef7af3a3570","Type":"ContainerDied","Data":"b636498a689029ec6650ae3f647daa5fed7211a9e60505d2955ddb37ef4c925b"} Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.589746 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b636498a689029ec6650ae3f647daa5fed7211a9e60505d2955ddb37ef4c925b" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.589770 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.689826 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs"] Sep 29 10:17:36 crc kubenswrapper[4891]: E0929 10:17:36.690300 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="677e5a8c-37d1-41a1-bd47-2ef7af3a3570" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.690321 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="677e5a8c-37d1-41a1-bd47-2ef7af3a3570" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.690564 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="677e5a8c-37d1-41a1-bd47-2ef7af3a3570" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.691331 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.693462 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b9rgd" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.693966 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.695022 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.695029 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.708081 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jg28\" (UniqueName: \"kubernetes.io/projected/a4406439-b507-4572-b458-58d0ddf2b94d-kube-api-access-9jg28\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs\" (UID: \"a4406439-b507-4572-b458-58d0ddf2b94d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.708255 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4406439-b507-4572-b458-58d0ddf2b94d-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs\" (UID: \"a4406439-b507-4572-b458-58d0ddf2b94d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.714157 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs"] Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 
10:17:36.719418 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4406439-b507-4572-b458-58d0ddf2b94d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs\" (UID: \"a4406439-b507-4572-b458-58d0ddf2b94d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.822272 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jg28\" (UniqueName: \"kubernetes.io/projected/a4406439-b507-4572-b458-58d0ddf2b94d-kube-api-access-9jg28\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs\" (UID: \"a4406439-b507-4572-b458-58d0ddf2b94d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.822362 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4406439-b507-4572-b458-58d0ddf2b94d-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs\" (UID: \"a4406439-b507-4572-b458-58d0ddf2b94d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.822539 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4406439-b507-4572-b458-58d0ddf2b94d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs\" (UID: \"a4406439-b507-4572-b458-58d0ddf2b94d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.828454 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4406439-b507-4572-b458-58d0ddf2b94d-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs\" (UID: \"a4406439-b507-4572-b458-58d0ddf2b94d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.828493 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4406439-b507-4572-b458-58d0ddf2b94d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs\" (UID: \"a4406439-b507-4572-b458-58d0ddf2b94d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs" Sep 29 10:17:36 crc kubenswrapper[4891]: I0929 10:17:36.843954 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jg28\" (UniqueName: \"kubernetes.io/projected/a4406439-b507-4572-b458-58d0ddf2b94d-kube-api-access-9jg28\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs\" (UID: \"a4406439-b507-4572-b458-58d0ddf2b94d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs" Sep 29 10:17:37 crc kubenswrapper[4891]: I0929 10:17:37.019208 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs" Sep 29 10:17:37 crc kubenswrapper[4891]: I0929 10:17:37.531800 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs"] Sep 29 10:17:37 crc kubenswrapper[4891]: I0929 10:17:37.598833 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs" event={"ID":"a4406439-b507-4572-b458-58d0ddf2b94d","Type":"ContainerStarted","Data":"923b1bc676b3c7d0f75e40d2d1b0075744f0ce6e8eb790a3e6d830baa86cf21d"} Sep 29 10:17:39 crc kubenswrapper[4891]: I0929 10:17:39.051429 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-d7m4m"] Sep 29 10:17:39 crc kubenswrapper[4891]: I0929 10:17:39.060882 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-d7m4m"] Sep 29 10:17:39 crc kubenswrapper[4891]: I0929 10:17:39.627613 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs" event={"ID":"a4406439-b507-4572-b458-58d0ddf2b94d","Type":"ContainerStarted","Data":"026616d1a7d05ce48a506eb13e428c6e571abb89597f1c18fe93161c52065dbd"} Sep 29 10:17:39 crc kubenswrapper[4891]: I0929 10:17:39.647242 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs" podStartSLOduration=2.470196813 podStartE2EDuration="3.647223621s" podCreationTimestamp="2025-09-29 10:17:36 +0000 UTC" firstStartedPulling="2025-09-29 10:17:37.537257929 +0000 UTC m=+1787.742426260" lastFinishedPulling="2025-09-29 10:17:38.714284747 +0000 UTC m=+1788.919453068" observedRunningTime="2025-09-29 10:17:39.642454493 +0000 UTC m=+1789.847622834" watchObservedRunningTime="2025-09-29 10:17:39.647223621 +0000 UTC m=+1789.852391952" Sep 29 10:17:40 crc 
kubenswrapper[4891]: I0929 10:17:40.429679 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b17acc18-5c95-4d7c-9576-ba976472f02d" path="/var/lib/kubelet/pods/b17acc18-5c95-4d7c-9576-ba976472f02d/volumes" Sep 29 10:17:41 crc kubenswrapper[4891]: I0929 10:17:41.027229 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kq9cj"] Sep 29 10:17:41 crc kubenswrapper[4891]: I0929 10:17:41.034858 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kq9cj"] Sep 29 10:17:42 crc kubenswrapper[4891]: I0929 10:17:42.407867 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c8148fa-ad96-4f4b-8910-7233808ce733" path="/var/lib/kubelet/pods/8c8148fa-ad96-4f4b-8910-7233808ce733/volumes" Sep 29 10:17:44 crc kubenswrapper[4891]: I0929 10:17:44.675629 4891 generic.go:334] "Generic (PLEG): container finished" podID="a4406439-b507-4572-b458-58d0ddf2b94d" containerID="026616d1a7d05ce48a506eb13e428c6e571abb89597f1c18fe93161c52065dbd" exitCode=0 Sep 29 10:17:44 crc kubenswrapper[4891]: I0929 10:17:44.675670 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs" event={"ID":"a4406439-b507-4572-b458-58d0ddf2b94d","Type":"ContainerDied","Data":"026616d1a7d05ce48a506eb13e428c6e571abb89597f1c18fe93161c52065dbd"} Sep 29 10:17:46 crc kubenswrapper[4891]: I0929 10:17:46.123576 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs" Sep 29 10:17:46 crc kubenswrapper[4891]: I0929 10:17:46.200012 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4406439-b507-4572-b458-58d0ddf2b94d-inventory\") pod \"a4406439-b507-4572-b458-58d0ddf2b94d\" (UID: \"a4406439-b507-4572-b458-58d0ddf2b94d\") " Sep 29 10:17:46 crc kubenswrapper[4891]: I0929 10:17:46.200116 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4406439-b507-4572-b458-58d0ddf2b94d-ssh-key\") pod \"a4406439-b507-4572-b458-58d0ddf2b94d\" (UID: \"a4406439-b507-4572-b458-58d0ddf2b94d\") " Sep 29 10:17:46 crc kubenswrapper[4891]: I0929 10:17:46.200162 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jg28\" (UniqueName: \"kubernetes.io/projected/a4406439-b507-4572-b458-58d0ddf2b94d-kube-api-access-9jg28\") pod \"a4406439-b507-4572-b458-58d0ddf2b94d\" (UID: \"a4406439-b507-4572-b458-58d0ddf2b94d\") " Sep 29 10:17:46 crc kubenswrapper[4891]: I0929 10:17:46.220761 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4406439-b507-4572-b458-58d0ddf2b94d-kube-api-access-9jg28" (OuterVolumeSpecName: "kube-api-access-9jg28") pod "a4406439-b507-4572-b458-58d0ddf2b94d" (UID: "a4406439-b507-4572-b458-58d0ddf2b94d"). InnerVolumeSpecName "kube-api-access-9jg28". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:17:46 crc kubenswrapper[4891]: E0929 10:17:46.233339 4891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4406439-b507-4572-b458-58d0ddf2b94d-ssh-key podName:a4406439-b507-4572-b458-58d0ddf2b94d nodeName:}" failed. No retries permitted until 2025-09-29 10:17:46.733311746 +0000 UTC m=+1796.938480077 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "ssh-key" (UniqueName: "kubernetes.io/secret/a4406439-b507-4572-b458-58d0ddf2b94d-ssh-key") pod "a4406439-b507-4572-b458-58d0ddf2b94d" (UID: "a4406439-b507-4572-b458-58d0ddf2b94d") : error deleting /var/lib/kubelet/pods/a4406439-b507-4572-b458-58d0ddf2b94d/volume-subpaths: remove /var/lib/kubelet/pods/a4406439-b507-4572-b458-58d0ddf2b94d/volume-subpaths: no such file or directory Sep 29 10:17:46 crc kubenswrapper[4891]: I0929 10:17:46.236431 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4406439-b507-4572-b458-58d0ddf2b94d-inventory" (OuterVolumeSpecName: "inventory") pod "a4406439-b507-4572-b458-58d0ddf2b94d" (UID: "a4406439-b507-4572-b458-58d0ddf2b94d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:17:46 crc kubenswrapper[4891]: I0929 10:17:46.302581 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jg28\" (UniqueName: \"kubernetes.io/projected/a4406439-b507-4572-b458-58d0ddf2b94d-kube-api-access-9jg28\") on node \"crc\" DevicePath \"\"" Sep 29 10:17:46 crc kubenswrapper[4891]: I0929 10:17:46.302633 4891 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4406439-b507-4572-b458-58d0ddf2b94d-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:17:46 crc kubenswrapper[4891]: I0929 10:17:46.697542 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs" event={"ID":"a4406439-b507-4572-b458-58d0ddf2b94d","Type":"ContainerDied","Data":"923b1bc676b3c7d0f75e40d2d1b0075744f0ce6e8eb790a3e6d830baa86cf21d"} Sep 29 10:17:46 crc kubenswrapper[4891]: I0929 10:17:46.697827 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="923b1bc676b3c7d0f75e40d2d1b0075744f0ce6e8eb790a3e6d830baa86cf21d" Sep 29 10:17:46 crc kubenswrapper[4891]: I0929 
10:17:46.697678 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs" Sep 29 10:17:46 crc kubenswrapper[4891]: I0929 10:17:46.795408 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kwtc"] Sep 29 10:17:46 crc kubenswrapper[4891]: E0929 10:17:46.795996 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4406439-b507-4572-b458-58d0ddf2b94d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 29 10:17:46 crc kubenswrapper[4891]: I0929 10:17:46.796024 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4406439-b507-4572-b458-58d0ddf2b94d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 29 10:17:46 crc kubenswrapper[4891]: I0929 10:17:46.796315 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4406439-b507-4572-b458-58d0ddf2b94d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 29 10:17:46 crc kubenswrapper[4891]: I0929 10:17:46.797138 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kwtc" Sep 29 10:17:46 crc kubenswrapper[4891]: I0929 10:17:46.802361 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kwtc"] Sep 29 10:17:46 crc kubenswrapper[4891]: I0929 10:17:46.811550 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4406439-b507-4572-b458-58d0ddf2b94d-ssh-key\") pod \"a4406439-b507-4572-b458-58d0ddf2b94d\" (UID: \"a4406439-b507-4572-b458-58d0ddf2b94d\") " Sep 29 10:17:46 crc kubenswrapper[4891]: I0929 10:17:46.840278 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4406439-b507-4572-b458-58d0ddf2b94d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a4406439-b507-4572-b458-58d0ddf2b94d" (UID: "a4406439-b507-4572-b458-58d0ddf2b94d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:17:46 crc kubenswrapper[4891]: I0929 10:17:46.913772 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b33262be-68ab-40c1-a34e-c629096460a8-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6kwtc\" (UID: \"b33262be-68ab-40c1-a34e-c629096460a8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kwtc" Sep 29 10:17:46 crc kubenswrapper[4891]: I0929 10:17:46.913915 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlwr2\" (UniqueName: \"kubernetes.io/projected/b33262be-68ab-40c1-a34e-c629096460a8-kube-api-access-nlwr2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6kwtc\" (UID: \"b33262be-68ab-40c1-a34e-c629096460a8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kwtc" Sep 29 10:17:46 crc kubenswrapper[4891]: I0929 
10:17:46.914063 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b33262be-68ab-40c1-a34e-c629096460a8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6kwtc\" (UID: \"b33262be-68ab-40c1-a34e-c629096460a8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kwtc" Sep 29 10:17:46 crc kubenswrapper[4891]: I0929 10:17:46.914184 4891 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4406439-b507-4572-b458-58d0ddf2b94d-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:17:47 crc kubenswrapper[4891]: I0929 10:17:47.015437 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b33262be-68ab-40c1-a34e-c629096460a8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6kwtc\" (UID: \"b33262be-68ab-40c1-a34e-c629096460a8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kwtc" Sep 29 10:17:47 crc kubenswrapper[4891]: I0929 10:17:47.015545 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b33262be-68ab-40c1-a34e-c629096460a8-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6kwtc\" (UID: \"b33262be-68ab-40c1-a34e-c629096460a8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kwtc" Sep 29 10:17:47 crc kubenswrapper[4891]: I0929 10:17:47.015627 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlwr2\" (UniqueName: \"kubernetes.io/projected/b33262be-68ab-40c1-a34e-c629096460a8-kube-api-access-nlwr2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6kwtc\" (UID: \"b33262be-68ab-40c1-a34e-c629096460a8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kwtc" Sep 29 10:17:47 crc kubenswrapper[4891]: I0929 
10:17:47.019876 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b33262be-68ab-40c1-a34e-c629096460a8-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6kwtc\" (UID: \"b33262be-68ab-40c1-a34e-c629096460a8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kwtc" Sep 29 10:17:47 crc kubenswrapper[4891]: I0929 10:17:47.020182 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b33262be-68ab-40c1-a34e-c629096460a8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6kwtc\" (UID: \"b33262be-68ab-40c1-a34e-c629096460a8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kwtc" Sep 29 10:17:47 crc kubenswrapper[4891]: I0929 10:17:47.032862 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlwr2\" (UniqueName: \"kubernetes.io/projected/b33262be-68ab-40c1-a34e-c629096460a8-kube-api-access-nlwr2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6kwtc\" (UID: \"b33262be-68ab-40c1-a34e-c629096460a8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kwtc" Sep 29 10:17:47 crc kubenswrapper[4891]: I0929 10:17:47.194727 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kwtc" Sep 29 10:17:47 crc kubenswrapper[4891]: I0929 10:17:47.395869 4891 scope.go:117] "RemoveContainer" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5" Sep 29 10:17:47 crc kubenswrapper[4891]: E0929 10:17:47.396120 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:17:47 crc kubenswrapper[4891]: I0929 10:17:47.740986 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kwtc"] Sep 29 10:17:48 crc kubenswrapper[4891]: I0929 10:17:48.720255 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kwtc" event={"ID":"b33262be-68ab-40c1-a34e-c629096460a8","Type":"ContainerStarted","Data":"d7942109fc8541ba1ab84349b6cb072a0f248631218acd7a603feb67bef516dc"} Sep 29 10:17:48 crc kubenswrapper[4891]: I0929 10:17:48.720764 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kwtc" event={"ID":"b33262be-68ab-40c1-a34e-c629096460a8","Type":"ContainerStarted","Data":"86486797d31e9a3957af6ad32428b5b03fdc0b8e524309913f41a1baa3f7b45a"} Sep 29 10:17:48 crc kubenswrapper[4891]: I0929 10:17:48.735376 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kwtc" podStartSLOduration=2.2535344 podStartE2EDuration="2.735350393s" podCreationTimestamp="2025-09-29 10:17:46 +0000 UTC" firstStartedPulling="2025-09-29 
10:17:47.758346449 +0000 UTC m=+1797.963514770" lastFinishedPulling="2025-09-29 10:17:48.240162442 +0000 UTC m=+1798.445330763" observedRunningTime="2025-09-29 10:17:48.733628673 +0000 UTC m=+1798.938796994" watchObservedRunningTime="2025-09-29 10:17:48.735350393 +0000 UTC m=+1798.940518714" Sep 29 10:18:02 crc kubenswrapper[4891]: I0929 10:18:02.396575 4891 scope.go:117] "RemoveContainer" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5" Sep 29 10:18:02 crc kubenswrapper[4891]: E0929 10:18:02.397735 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:18:16 crc kubenswrapper[4891]: I0929 10:18:16.292057 4891 scope.go:117] "RemoveContainer" containerID="2d45602c0835c15b231030f2d961822d224fdeb285a557b60e5f0d9bf637b68c" Sep 29 10:18:16 crc kubenswrapper[4891]: I0929 10:18:16.365551 4891 scope.go:117] "RemoveContainer" containerID="c9c9b0e6401067a374ff2dd6c76854ccab5f1fcd5c405c04588ebd8e17ef8ea4" Sep 29 10:18:16 crc kubenswrapper[4891]: I0929 10:18:16.396157 4891 scope.go:117] "RemoveContainer" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5" Sep 29 10:18:16 crc kubenswrapper[4891]: E0929 10:18:16.396572 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 
29 10:18:16 crc kubenswrapper[4891]: I0929 10:18:16.433873 4891 scope.go:117] "RemoveContainer" containerID="2d7ac500e158e103dd51ba49af2b87c5f48717820109d2f4561edaa95d5f6c12" Sep 29 10:18:24 crc kubenswrapper[4891]: I0929 10:18:24.074245 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-4tx5q"] Sep 29 10:18:24 crc kubenswrapper[4891]: I0929 10:18:24.085518 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-4tx5q"] Sep 29 10:18:24 crc kubenswrapper[4891]: I0929 10:18:24.413552 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15c459e2-314b-44bd-9e6a-7ae0b907b4b7" path="/var/lib/kubelet/pods/15c459e2-314b-44bd-9e6a-7ae0b907b4b7/volumes" Sep 29 10:18:27 crc kubenswrapper[4891]: I0929 10:18:27.395566 4891 scope.go:117] "RemoveContainer" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5" Sep 29 10:18:27 crc kubenswrapper[4891]: E0929 10:18:27.396175 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:18:28 crc kubenswrapper[4891]: I0929 10:18:28.115258 4891 generic.go:334] "Generic (PLEG): container finished" podID="b33262be-68ab-40c1-a34e-c629096460a8" containerID="d7942109fc8541ba1ab84349b6cb072a0f248631218acd7a603feb67bef516dc" exitCode=0 Sep 29 10:18:28 crc kubenswrapper[4891]: I0929 10:18:28.115392 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kwtc" 
event={"ID":"b33262be-68ab-40c1-a34e-c629096460a8","Type":"ContainerDied","Data":"d7942109fc8541ba1ab84349b6cb072a0f248631218acd7a603feb67bef516dc"} Sep 29 10:18:29 crc kubenswrapper[4891]: I0929 10:18:29.561270 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kwtc" Sep 29 10:18:29 crc kubenswrapper[4891]: I0929 10:18:29.669639 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b33262be-68ab-40c1-a34e-c629096460a8-inventory\") pod \"b33262be-68ab-40c1-a34e-c629096460a8\" (UID: \"b33262be-68ab-40c1-a34e-c629096460a8\") " Sep 29 10:18:29 crc kubenswrapper[4891]: I0929 10:18:29.669746 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlwr2\" (UniqueName: \"kubernetes.io/projected/b33262be-68ab-40c1-a34e-c629096460a8-kube-api-access-nlwr2\") pod \"b33262be-68ab-40c1-a34e-c629096460a8\" (UID: \"b33262be-68ab-40c1-a34e-c629096460a8\") " Sep 29 10:18:29 crc kubenswrapper[4891]: I0929 10:18:29.669865 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b33262be-68ab-40c1-a34e-c629096460a8-ssh-key\") pod \"b33262be-68ab-40c1-a34e-c629096460a8\" (UID: \"b33262be-68ab-40c1-a34e-c629096460a8\") " Sep 29 10:18:29 crc kubenswrapper[4891]: I0929 10:18:29.676185 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b33262be-68ab-40c1-a34e-c629096460a8-kube-api-access-nlwr2" (OuterVolumeSpecName: "kube-api-access-nlwr2") pod "b33262be-68ab-40c1-a34e-c629096460a8" (UID: "b33262be-68ab-40c1-a34e-c629096460a8"). InnerVolumeSpecName "kube-api-access-nlwr2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:18:29 crc kubenswrapper[4891]: I0929 10:18:29.698567 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b33262be-68ab-40c1-a34e-c629096460a8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b33262be-68ab-40c1-a34e-c629096460a8" (UID: "b33262be-68ab-40c1-a34e-c629096460a8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:18:29 crc kubenswrapper[4891]: I0929 10:18:29.720387 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b33262be-68ab-40c1-a34e-c629096460a8-inventory" (OuterVolumeSpecName: "inventory") pod "b33262be-68ab-40c1-a34e-c629096460a8" (UID: "b33262be-68ab-40c1-a34e-c629096460a8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:18:29 crc kubenswrapper[4891]: I0929 10:18:29.772965 4891 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b33262be-68ab-40c1-a34e-c629096460a8-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:18:29 crc kubenswrapper[4891]: I0929 10:18:29.773007 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlwr2\" (UniqueName: \"kubernetes.io/projected/b33262be-68ab-40c1-a34e-c629096460a8-kube-api-access-nlwr2\") on node \"crc\" DevicePath \"\"" Sep 29 10:18:29 crc kubenswrapper[4891]: I0929 10:18:29.773029 4891 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b33262be-68ab-40c1-a34e-c629096460a8-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:18:30 crc kubenswrapper[4891]: I0929 10:18:30.133941 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kwtc" 
event={"ID":"b33262be-68ab-40c1-a34e-c629096460a8","Type":"ContainerDied","Data":"86486797d31e9a3957af6ad32428b5b03fdc0b8e524309913f41a1baa3f7b45a"} Sep 29 10:18:30 crc kubenswrapper[4891]: I0929 10:18:30.134327 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86486797d31e9a3957af6ad32428b5b03fdc0b8e524309913f41a1baa3f7b45a" Sep 29 10:18:30 crc kubenswrapper[4891]: I0929 10:18:30.134010 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6kwtc" Sep 29 10:18:30 crc kubenswrapper[4891]: I0929 10:18:30.236701 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl"] Sep 29 10:18:30 crc kubenswrapper[4891]: E0929 10:18:30.237193 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b33262be-68ab-40c1-a34e-c629096460a8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 29 10:18:30 crc kubenswrapper[4891]: I0929 10:18:30.237216 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="b33262be-68ab-40c1-a34e-c629096460a8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 29 10:18:30 crc kubenswrapper[4891]: I0929 10:18:30.237423 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="b33262be-68ab-40c1-a34e-c629096460a8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 29 10:18:30 crc kubenswrapper[4891]: I0929 10:18:30.238106 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl" Sep 29 10:18:30 crc kubenswrapper[4891]: I0929 10:18:30.240884 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:18:30 crc kubenswrapper[4891]: I0929 10:18:30.241220 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:18:30 crc kubenswrapper[4891]: I0929 10:18:30.241414 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:18:30 crc kubenswrapper[4891]: I0929 10:18:30.241593 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b9rgd" Sep 29 10:18:30 crc kubenswrapper[4891]: I0929 10:18:30.255427 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl"] Sep 29 10:18:30 crc kubenswrapper[4891]: I0929 10:18:30.382595 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7tvr\" (UniqueName: \"kubernetes.io/projected/ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3-kube-api-access-f7tvr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl\" (UID: \"ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl" Sep 29 10:18:30 crc kubenswrapper[4891]: I0929 10:18:30.382701 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl\" (UID: \"ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl" Sep 29 10:18:30 crc kubenswrapper[4891]: I0929 10:18:30.382976 4891 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl\" (UID: \"ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl" Sep 29 10:18:30 crc kubenswrapper[4891]: I0929 10:18:30.485514 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7tvr\" (UniqueName: \"kubernetes.io/projected/ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3-kube-api-access-f7tvr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl\" (UID: \"ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl" Sep 29 10:18:30 crc kubenswrapper[4891]: I0929 10:18:30.485634 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl\" (UID: \"ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl" Sep 29 10:18:30 crc kubenswrapper[4891]: I0929 10:18:30.485716 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl\" (UID: \"ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl" Sep 29 10:18:30 crc kubenswrapper[4891]: I0929 10:18:30.491287 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl\" (UID: 
\"ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl" Sep 29 10:18:30 crc kubenswrapper[4891]: I0929 10:18:30.494656 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl\" (UID: \"ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl" Sep 29 10:18:30 crc kubenswrapper[4891]: I0929 10:18:30.514965 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7tvr\" (UniqueName: \"kubernetes.io/projected/ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3-kube-api-access-f7tvr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl\" (UID: \"ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl" Sep 29 10:18:30 crc kubenswrapper[4891]: I0929 10:18:30.559944 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl" Sep 29 10:18:31 crc kubenswrapper[4891]: I0929 10:18:31.209706 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl"] Sep 29 10:18:32 crc kubenswrapper[4891]: I0929 10:18:32.157019 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl" event={"ID":"ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3","Type":"ContainerStarted","Data":"c9fdc64ad1d9a45c63d322e90ab5880e0c59cefd7ab374efcbe8f2851c9b32e5"} Sep 29 10:18:32 crc kubenswrapper[4891]: I0929 10:18:32.157542 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl" event={"ID":"ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3","Type":"ContainerStarted","Data":"866bd046fa1431229085bfca2a3f3968bb566eae36616de1f9343033dfbdae71"} Sep 29 10:18:32 crc kubenswrapper[4891]: I0929 10:18:32.178958 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl" podStartSLOduration=1.6600953459999999 podStartE2EDuration="2.178936354s" podCreationTimestamp="2025-09-29 10:18:30 +0000 UTC" firstStartedPulling="2025-09-29 10:18:31.221296093 +0000 UTC m=+1841.426464414" lastFinishedPulling="2025-09-29 10:18:31.740137101 +0000 UTC m=+1841.945305422" observedRunningTime="2025-09-29 10:18:32.173163306 +0000 UTC m=+1842.378331677" watchObservedRunningTime="2025-09-29 10:18:32.178936354 +0000 UTC m=+1842.384104675" Sep 29 10:18:40 crc kubenswrapper[4891]: I0929 10:18:40.405101 4891 scope.go:117] "RemoveContainer" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5" Sep 29 10:18:40 crc kubenswrapper[4891]: E0929 10:18:40.408174 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:18:55 crc kubenswrapper[4891]: I0929 10:18:55.397253 4891 scope.go:117] "RemoveContainer" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5" Sep 29 10:18:55 crc kubenswrapper[4891]: E0929 10:18:55.398420 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:19:10 crc kubenswrapper[4891]: I0929 10:19:10.405373 4891 scope.go:117] "RemoveContainer" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5" Sep 29 10:19:10 crc kubenswrapper[4891]: E0929 10:19:10.406290 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:19:16 crc kubenswrapper[4891]: I0929 10:19:16.572106 4891 scope.go:117] "RemoveContainer" containerID="9f78c704768ceb5f8c3bfc3e9306f10ff8c14f68a4d412f0ba7036b8319e5497" Sep 29 10:19:22 crc kubenswrapper[4891]: I0929 10:19:22.397386 4891 scope.go:117] "RemoveContainer" 
containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5" Sep 29 10:19:22 crc kubenswrapper[4891]: E0929 10:19:22.398190 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:19:25 crc kubenswrapper[4891]: I0929 10:19:25.710130 4891 generic.go:334] "Generic (PLEG): container finished" podID="ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3" containerID="c9fdc64ad1d9a45c63d322e90ab5880e0c59cefd7ab374efcbe8f2851c9b32e5" exitCode=0 Sep 29 10:19:25 crc kubenswrapper[4891]: I0929 10:19:25.710385 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl" event={"ID":"ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3","Type":"ContainerDied","Data":"c9fdc64ad1d9a45c63d322e90ab5880e0c59cefd7ab374efcbe8f2851c9b32e5"} Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.121775 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl" Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.166474 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3-ssh-key\") pod \"ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3\" (UID: \"ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3\") " Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.166551 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3-inventory\") pod \"ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3\" (UID: \"ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3\") " Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.166744 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7tvr\" (UniqueName: \"kubernetes.io/projected/ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3-kube-api-access-f7tvr\") pod \"ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3\" (UID: \"ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3\") " Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.173990 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3-kube-api-access-f7tvr" (OuterVolumeSpecName: "kube-api-access-f7tvr") pod "ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3" (UID: "ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3"). InnerVolumeSpecName "kube-api-access-f7tvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.197155 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3-inventory" (OuterVolumeSpecName: "inventory") pod "ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3" (UID: "ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.201453 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3" (UID: "ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.268766 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7tvr\" (UniqueName: \"kubernetes.io/projected/ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3-kube-api-access-f7tvr\") on node \"crc\" DevicePath \"\"" Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.269102 4891 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.269114 4891 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.732385 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl" event={"ID":"ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3","Type":"ContainerDied","Data":"866bd046fa1431229085bfca2a3f3968bb566eae36616de1f9343033dfbdae71"} Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.732446 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="866bd046fa1431229085bfca2a3f3968bb566eae36616de1f9343033dfbdae71" Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.732456 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl" Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.827247 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-68vb5"] Sep 29 10:19:27 crc kubenswrapper[4891]: E0929 10:19:27.827848 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.827877 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.828101 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.828976 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-68vb5" Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.834489 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.834564 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.834501 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b9rgd" Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.835294 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.843904 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-68vb5"] Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.883186 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e5a83354-1dda-4488-a048-16ac1b5f36f5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-68vb5\" (UID: \"e5a83354-1dda-4488-a048-16ac1b5f36f5\") " pod="openstack/ssh-known-hosts-edpm-deployment-68vb5" Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.883258 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wf2g\" (UniqueName: \"kubernetes.io/projected/e5a83354-1dda-4488-a048-16ac1b5f36f5-kube-api-access-6wf2g\") pod \"ssh-known-hosts-edpm-deployment-68vb5\" (UID: \"e5a83354-1dda-4488-a048-16ac1b5f36f5\") " pod="openstack/ssh-known-hosts-edpm-deployment-68vb5" Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.883375 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e5a83354-1dda-4488-a048-16ac1b5f36f5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-68vb5\" (UID: \"e5a83354-1dda-4488-a048-16ac1b5f36f5\") " pod="openstack/ssh-known-hosts-edpm-deployment-68vb5" Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.985018 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e5a83354-1dda-4488-a048-16ac1b5f36f5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-68vb5\" (UID: \"e5a83354-1dda-4488-a048-16ac1b5f36f5\") " pod="openstack/ssh-known-hosts-edpm-deployment-68vb5" Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.985085 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wf2g\" (UniqueName: \"kubernetes.io/projected/e5a83354-1dda-4488-a048-16ac1b5f36f5-kube-api-access-6wf2g\") pod \"ssh-known-hosts-edpm-deployment-68vb5\" (UID: \"e5a83354-1dda-4488-a048-16ac1b5f36f5\") " pod="openstack/ssh-known-hosts-edpm-deployment-68vb5" Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.985203 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e5a83354-1dda-4488-a048-16ac1b5f36f5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-68vb5\" (UID: \"e5a83354-1dda-4488-a048-16ac1b5f36f5\") " pod="openstack/ssh-known-hosts-edpm-deployment-68vb5" Sep 29 10:19:27 crc kubenswrapper[4891]: I0929 10:19:27.989456 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e5a83354-1dda-4488-a048-16ac1b5f36f5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-68vb5\" (UID: \"e5a83354-1dda-4488-a048-16ac1b5f36f5\") " pod="openstack/ssh-known-hosts-edpm-deployment-68vb5" Sep 29 10:19:27 crc 
kubenswrapper[4891]: I0929 10:19:27.989456 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e5a83354-1dda-4488-a048-16ac1b5f36f5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-68vb5\" (UID: \"e5a83354-1dda-4488-a048-16ac1b5f36f5\") " pod="openstack/ssh-known-hosts-edpm-deployment-68vb5" Sep 29 10:19:28 crc kubenswrapper[4891]: I0929 10:19:28.004490 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wf2g\" (UniqueName: \"kubernetes.io/projected/e5a83354-1dda-4488-a048-16ac1b5f36f5-kube-api-access-6wf2g\") pod \"ssh-known-hosts-edpm-deployment-68vb5\" (UID: \"e5a83354-1dda-4488-a048-16ac1b5f36f5\") " pod="openstack/ssh-known-hosts-edpm-deployment-68vb5" Sep 29 10:19:28 crc kubenswrapper[4891]: I0929 10:19:28.189969 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-68vb5" Sep 29 10:19:28 crc kubenswrapper[4891]: I0929 10:19:28.699636 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-68vb5"] Sep 29 10:19:28 crc kubenswrapper[4891]: I0929 10:19:28.708577 4891 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 10:19:28 crc kubenswrapper[4891]: I0929 10:19:28.744552 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-68vb5" event={"ID":"e5a83354-1dda-4488-a048-16ac1b5f36f5","Type":"ContainerStarted","Data":"71c4efe553ac1423fd7f9439443b8c20dcca81e4ca7866a5a546521502d64083"} Sep 29 10:19:29 crc kubenswrapper[4891]: I0929 10:19:29.757723 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-68vb5" event={"ID":"e5a83354-1dda-4488-a048-16ac1b5f36f5","Type":"ContainerStarted","Data":"68d703cddd2c41b37f1b49bc3e653daedf2bad7a141aa6f2bfe5b9f10cff8c25"} Sep 29 
10:19:29 crc kubenswrapper[4891]: I0929 10:19:29.777780 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-68vb5" podStartSLOduration=2.33645824 podStartE2EDuration="2.777754375s" podCreationTimestamp="2025-09-29 10:19:27 +0000 UTC" firstStartedPulling="2025-09-29 10:19:28.708291943 +0000 UTC m=+1898.913460274" lastFinishedPulling="2025-09-29 10:19:29.149588058 +0000 UTC m=+1899.354756409" observedRunningTime="2025-09-29 10:19:29.773758759 +0000 UTC m=+1899.978927100" watchObservedRunningTime="2025-09-29 10:19:29.777754375 +0000 UTC m=+1899.982922716" Sep 29 10:19:35 crc kubenswrapper[4891]: I0929 10:19:35.396890 4891 scope.go:117] "RemoveContainer" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5" Sep 29 10:19:35 crc kubenswrapper[4891]: E0929 10:19:35.398152 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:19:36 crc kubenswrapper[4891]: I0929 10:19:36.834299 4891 generic.go:334] "Generic (PLEG): container finished" podID="e5a83354-1dda-4488-a048-16ac1b5f36f5" containerID="68d703cddd2c41b37f1b49bc3e653daedf2bad7a141aa6f2bfe5b9f10cff8c25" exitCode=0 Sep 29 10:19:36 crc kubenswrapper[4891]: I0929 10:19:36.834399 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-68vb5" event={"ID":"e5a83354-1dda-4488-a048-16ac1b5f36f5","Type":"ContainerDied","Data":"68d703cddd2c41b37f1b49bc3e653daedf2bad7a141aa6f2bfe5b9f10cff8c25"} Sep 29 10:19:38 crc kubenswrapper[4891]: I0929 10:19:38.326142 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-68vb5" Sep 29 10:19:38 crc kubenswrapper[4891]: I0929 10:19:38.399425 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e5a83354-1dda-4488-a048-16ac1b5f36f5-inventory-0\") pod \"e5a83354-1dda-4488-a048-16ac1b5f36f5\" (UID: \"e5a83354-1dda-4488-a048-16ac1b5f36f5\") " Sep 29 10:19:38 crc kubenswrapper[4891]: I0929 10:19:38.399785 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wf2g\" (UniqueName: \"kubernetes.io/projected/e5a83354-1dda-4488-a048-16ac1b5f36f5-kube-api-access-6wf2g\") pod \"e5a83354-1dda-4488-a048-16ac1b5f36f5\" (UID: \"e5a83354-1dda-4488-a048-16ac1b5f36f5\") " Sep 29 10:19:38 crc kubenswrapper[4891]: I0929 10:19:38.399911 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e5a83354-1dda-4488-a048-16ac1b5f36f5-ssh-key-openstack-edpm-ipam\") pod \"e5a83354-1dda-4488-a048-16ac1b5f36f5\" (UID: \"e5a83354-1dda-4488-a048-16ac1b5f36f5\") " Sep 29 10:19:38 crc kubenswrapper[4891]: I0929 10:19:38.405570 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5a83354-1dda-4488-a048-16ac1b5f36f5-kube-api-access-6wf2g" (OuterVolumeSpecName: "kube-api-access-6wf2g") pod "e5a83354-1dda-4488-a048-16ac1b5f36f5" (UID: "e5a83354-1dda-4488-a048-16ac1b5f36f5"). InnerVolumeSpecName "kube-api-access-6wf2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:19:38 crc kubenswrapper[4891]: I0929 10:19:38.427294 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a83354-1dda-4488-a048-16ac1b5f36f5-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "e5a83354-1dda-4488-a048-16ac1b5f36f5" (UID: "e5a83354-1dda-4488-a048-16ac1b5f36f5"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:19:38 crc kubenswrapper[4891]: I0929 10:19:38.435633 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a83354-1dda-4488-a048-16ac1b5f36f5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e5a83354-1dda-4488-a048-16ac1b5f36f5" (UID: "e5a83354-1dda-4488-a048-16ac1b5f36f5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:19:38 crc kubenswrapper[4891]: I0929 10:19:38.503717 4891 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e5a83354-1dda-4488-a048-16ac1b5f36f5-inventory-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:19:38 crc kubenswrapper[4891]: I0929 10:19:38.503759 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wf2g\" (UniqueName: \"kubernetes.io/projected/e5a83354-1dda-4488-a048-16ac1b5f36f5-kube-api-access-6wf2g\") on node \"crc\" DevicePath \"\"" Sep 29 10:19:38 crc kubenswrapper[4891]: I0929 10:19:38.503775 4891 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e5a83354-1dda-4488-a048-16ac1b5f36f5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Sep 29 10:19:38 crc kubenswrapper[4891]: I0929 10:19:38.860586 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-68vb5" event={"ID":"e5a83354-1dda-4488-a048-16ac1b5f36f5","Type":"ContainerDied","Data":"71c4efe553ac1423fd7f9439443b8c20dcca81e4ca7866a5a546521502d64083"} Sep 29 10:19:38 crc kubenswrapper[4891]: I0929 10:19:38.860677 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71c4efe553ac1423fd7f9439443b8c20dcca81e4ca7866a5a546521502d64083" Sep 29 10:19:38 crc kubenswrapper[4891]: I0929 10:19:38.861178 
4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-68vb5" Sep 29 10:19:38 crc kubenswrapper[4891]: I0929 10:19:38.953413 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2zlpr"] Sep 29 10:19:38 crc kubenswrapper[4891]: E0929 10:19:38.953963 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a83354-1dda-4488-a048-16ac1b5f36f5" containerName="ssh-known-hosts-edpm-deployment" Sep 29 10:19:38 crc kubenswrapper[4891]: I0929 10:19:38.953985 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a83354-1dda-4488-a048-16ac1b5f36f5" containerName="ssh-known-hosts-edpm-deployment" Sep 29 10:19:38 crc kubenswrapper[4891]: I0929 10:19:38.954236 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a83354-1dda-4488-a048-16ac1b5f36f5" containerName="ssh-known-hosts-edpm-deployment" Sep 29 10:19:38 crc kubenswrapper[4891]: I0929 10:19:38.955082 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2zlpr" Sep 29 10:19:38 crc kubenswrapper[4891]: I0929 10:19:38.958876 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:19:38 crc kubenswrapper[4891]: I0929 10:19:38.959080 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b9rgd" Sep 29 10:19:38 crc kubenswrapper[4891]: I0929 10:19:38.959210 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:19:38 crc kubenswrapper[4891]: I0929 10:19:38.959305 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:19:38 crc kubenswrapper[4891]: I0929 10:19:38.964210 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2zlpr"] Sep 29 10:19:39 crc kubenswrapper[4891]: I0929 10:19:39.016255 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz9zs\" (UniqueName: \"kubernetes.io/projected/92316376-b91d-4e78-ac0c-6f03f1be5f26-kube-api-access-hz9zs\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2zlpr\" (UID: \"92316376-b91d-4e78-ac0c-6f03f1be5f26\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2zlpr" Sep 29 10:19:39 crc kubenswrapper[4891]: I0929 10:19:39.016330 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/92316376-b91d-4e78-ac0c-6f03f1be5f26-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2zlpr\" (UID: \"92316376-b91d-4e78-ac0c-6f03f1be5f26\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2zlpr" Sep 29 10:19:39 crc kubenswrapper[4891]: I0929 10:19:39.016501 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92316376-b91d-4e78-ac0c-6f03f1be5f26-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2zlpr\" (UID: \"92316376-b91d-4e78-ac0c-6f03f1be5f26\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2zlpr" Sep 29 10:19:39 crc kubenswrapper[4891]: I0929 10:19:39.118336 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz9zs\" (UniqueName: \"kubernetes.io/projected/92316376-b91d-4e78-ac0c-6f03f1be5f26-kube-api-access-hz9zs\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2zlpr\" (UID: \"92316376-b91d-4e78-ac0c-6f03f1be5f26\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2zlpr" Sep 29 10:19:39 crc kubenswrapper[4891]: I0929 10:19:39.118949 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/92316376-b91d-4e78-ac0c-6f03f1be5f26-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2zlpr\" (UID: \"92316376-b91d-4e78-ac0c-6f03f1be5f26\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2zlpr" Sep 29 10:19:39 crc kubenswrapper[4891]: I0929 10:19:39.119900 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92316376-b91d-4e78-ac0c-6f03f1be5f26-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2zlpr\" (UID: \"92316376-b91d-4e78-ac0c-6f03f1be5f26\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2zlpr" Sep 29 10:19:39 crc kubenswrapper[4891]: I0929 10:19:39.124514 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/92316376-b91d-4e78-ac0c-6f03f1be5f26-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2zlpr\" (UID: \"92316376-b91d-4e78-ac0c-6f03f1be5f26\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2zlpr" Sep 29 10:19:39 crc kubenswrapper[4891]: I0929 10:19:39.126168 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92316376-b91d-4e78-ac0c-6f03f1be5f26-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2zlpr\" (UID: \"92316376-b91d-4e78-ac0c-6f03f1be5f26\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2zlpr" Sep 29 10:19:39 crc kubenswrapper[4891]: I0929 10:19:39.152273 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz9zs\" (UniqueName: \"kubernetes.io/projected/92316376-b91d-4e78-ac0c-6f03f1be5f26-kube-api-access-hz9zs\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2zlpr\" (UID: \"92316376-b91d-4e78-ac0c-6f03f1be5f26\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2zlpr" Sep 29 10:19:39 crc kubenswrapper[4891]: I0929 10:19:39.278975 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2zlpr" Sep 29 10:19:39 crc kubenswrapper[4891]: I0929 10:19:39.859468 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2zlpr"] Sep 29 10:19:40 crc kubenswrapper[4891]: I0929 10:19:40.882957 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2zlpr" event={"ID":"92316376-b91d-4e78-ac0c-6f03f1be5f26","Type":"ContainerStarted","Data":"776d9ffd6f6346a2ed14031766452d49033193ed4b71f4f578b146d97408a53d"} Sep 29 10:19:40 crc kubenswrapper[4891]: I0929 10:19:40.883422 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2zlpr" event={"ID":"92316376-b91d-4e78-ac0c-6f03f1be5f26","Type":"ContainerStarted","Data":"ff57bddd5d6dd72154d11b408083a5c7b4d01302011eba5cd2b7364df89dfb6b"} Sep 29 10:19:40 crc kubenswrapper[4891]: I0929 10:19:40.899708 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2zlpr" podStartSLOduration=2.3197373900000002 podStartE2EDuration="2.899683226s" podCreationTimestamp="2025-09-29 10:19:38 +0000 UTC" firstStartedPulling="2025-09-29 10:19:39.867575329 +0000 UTC m=+1910.072743650" lastFinishedPulling="2025-09-29 10:19:40.447521165 +0000 UTC m=+1910.652689486" observedRunningTime="2025-09-29 10:19:40.898570374 +0000 UTC m=+1911.103738715" watchObservedRunningTime="2025-09-29 10:19:40.899683226 +0000 UTC m=+1911.104851567" Sep 29 10:19:46 crc kubenswrapper[4891]: I0929 10:19:46.395710 4891 scope.go:117] "RemoveContainer" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5" Sep 29 10:19:46 crc kubenswrapper[4891]: I0929 10:19:46.939525 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" 
event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerStarted","Data":"bff6fe3da678e36dd64ab83405868d66f2473cdc2a11bd70b1d1ce788f8ab4cf"} Sep 29 10:19:49 crc kubenswrapper[4891]: I0929 10:19:49.971241 4891 generic.go:334] "Generic (PLEG): container finished" podID="92316376-b91d-4e78-ac0c-6f03f1be5f26" containerID="776d9ffd6f6346a2ed14031766452d49033193ed4b71f4f578b146d97408a53d" exitCode=0 Sep 29 10:19:49 crc kubenswrapper[4891]: I0929 10:19:49.971308 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2zlpr" event={"ID":"92316376-b91d-4e78-ac0c-6f03f1be5f26","Type":"ContainerDied","Data":"776d9ffd6f6346a2ed14031766452d49033193ed4b71f4f578b146d97408a53d"} Sep 29 10:19:51 crc kubenswrapper[4891]: I0929 10:19:51.449401 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2zlpr" Sep 29 10:19:51 crc kubenswrapper[4891]: I0929 10:19:51.468150 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/92316376-b91d-4e78-ac0c-6f03f1be5f26-ssh-key\") pod \"92316376-b91d-4e78-ac0c-6f03f1be5f26\" (UID: \"92316376-b91d-4e78-ac0c-6f03f1be5f26\") " Sep 29 10:19:51 crc kubenswrapper[4891]: I0929 10:19:51.522857 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92316376-b91d-4e78-ac0c-6f03f1be5f26-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "92316376-b91d-4e78-ac0c-6f03f1be5f26" (UID: "92316376-b91d-4e78-ac0c-6f03f1be5f26"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:19:51 crc kubenswrapper[4891]: I0929 10:19:51.571842 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz9zs\" (UniqueName: \"kubernetes.io/projected/92316376-b91d-4e78-ac0c-6f03f1be5f26-kube-api-access-hz9zs\") pod \"92316376-b91d-4e78-ac0c-6f03f1be5f26\" (UID: \"92316376-b91d-4e78-ac0c-6f03f1be5f26\") " Sep 29 10:19:51 crc kubenswrapper[4891]: I0929 10:19:51.571893 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92316376-b91d-4e78-ac0c-6f03f1be5f26-inventory\") pod \"92316376-b91d-4e78-ac0c-6f03f1be5f26\" (UID: \"92316376-b91d-4e78-ac0c-6f03f1be5f26\") " Sep 29 10:19:51 crc kubenswrapper[4891]: I0929 10:19:51.572694 4891 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/92316376-b91d-4e78-ac0c-6f03f1be5f26-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:19:51 crc kubenswrapper[4891]: I0929 10:19:51.575319 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92316376-b91d-4e78-ac0c-6f03f1be5f26-kube-api-access-hz9zs" (OuterVolumeSpecName: "kube-api-access-hz9zs") pod "92316376-b91d-4e78-ac0c-6f03f1be5f26" (UID: "92316376-b91d-4e78-ac0c-6f03f1be5f26"). InnerVolumeSpecName "kube-api-access-hz9zs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:19:51 crc kubenswrapper[4891]: I0929 10:19:51.601443 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92316376-b91d-4e78-ac0c-6f03f1be5f26-inventory" (OuterVolumeSpecName: "inventory") pod "92316376-b91d-4e78-ac0c-6f03f1be5f26" (UID: "92316376-b91d-4e78-ac0c-6f03f1be5f26"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:19:51 crc kubenswrapper[4891]: I0929 10:19:51.675170 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz9zs\" (UniqueName: \"kubernetes.io/projected/92316376-b91d-4e78-ac0c-6f03f1be5f26-kube-api-access-hz9zs\") on node \"crc\" DevicePath \"\"" Sep 29 10:19:51 crc kubenswrapper[4891]: I0929 10:19:51.675214 4891 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92316376-b91d-4e78-ac0c-6f03f1be5f26-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:19:51 crc kubenswrapper[4891]: I0929 10:19:51.991925 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2zlpr" event={"ID":"92316376-b91d-4e78-ac0c-6f03f1be5f26","Type":"ContainerDied","Data":"ff57bddd5d6dd72154d11b408083a5c7b4d01302011eba5cd2b7364df89dfb6b"} Sep 29 10:19:51 crc kubenswrapper[4891]: I0929 10:19:51.992280 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff57bddd5d6dd72154d11b408083a5c7b4d01302011eba5cd2b7364df89dfb6b" Sep 29 10:19:51 crc kubenswrapper[4891]: I0929 10:19:51.991977 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2zlpr" Sep 29 10:19:52 crc kubenswrapper[4891]: I0929 10:19:52.121649 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5"] Sep 29 10:19:52 crc kubenswrapper[4891]: E0929 10:19:52.122120 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92316376-b91d-4e78-ac0c-6f03f1be5f26" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 29 10:19:52 crc kubenswrapper[4891]: I0929 10:19:52.122139 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="92316376-b91d-4e78-ac0c-6f03f1be5f26" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 29 10:19:52 crc kubenswrapper[4891]: I0929 10:19:52.122372 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="92316376-b91d-4e78-ac0c-6f03f1be5f26" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 29 10:19:52 crc kubenswrapper[4891]: I0929 10:19:52.123180 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5" Sep 29 10:19:52 crc kubenswrapper[4891]: I0929 10:19:52.127934 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:19:52 crc kubenswrapper[4891]: I0929 10:19:52.128340 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b9rgd" Sep 29 10:19:52 crc kubenswrapper[4891]: I0929 10:19:52.128637 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:19:52 crc kubenswrapper[4891]: I0929 10:19:52.128923 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:19:52 crc kubenswrapper[4891]: I0929 10:19:52.152385 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5"] Sep 29 10:19:52 crc kubenswrapper[4891]: E0929 10:19:52.176213 4891 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92316376_b91d_4e78_ac0c_6f03f1be5f26.slice\": RecentStats: unable to find data in memory cache]" Sep 29 10:19:52 crc kubenswrapper[4891]: I0929 10:19:52.281843 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd0ca11c-98f5-4734-bc9a-fef72b1004f8-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5\" (UID: \"cd0ca11c-98f5-4734-bc9a-fef72b1004f8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5" Sep 29 10:19:52 crc kubenswrapper[4891]: I0929 10:19:52.283515 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/cd0ca11c-98f5-4734-bc9a-fef72b1004f8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5\" (UID: \"cd0ca11c-98f5-4734-bc9a-fef72b1004f8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5" Sep 29 10:19:52 crc kubenswrapper[4891]: I0929 10:19:52.283761 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsm45\" (UniqueName: \"kubernetes.io/projected/cd0ca11c-98f5-4734-bc9a-fef72b1004f8-kube-api-access-vsm45\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5\" (UID: \"cd0ca11c-98f5-4734-bc9a-fef72b1004f8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5" Sep 29 10:19:52 crc kubenswrapper[4891]: I0929 10:19:52.385376 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsm45\" (UniqueName: \"kubernetes.io/projected/cd0ca11c-98f5-4734-bc9a-fef72b1004f8-kube-api-access-vsm45\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5\" (UID: \"cd0ca11c-98f5-4734-bc9a-fef72b1004f8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5" Sep 29 10:19:52 crc kubenswrapper[4891]: I0929 10:19:52.385530 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd0ca11c-98f5-4734-bc9a-fef72b1004f8-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5\" (UID: \"cd0ca11c-98f5-4734-bc9a-fef72b1004f8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5" Sep 29 10:19:52 crc kubenswrapper[4891]: I0929 10:19:52.385565 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd0ca11c-98f5-4734-bc9a-fef72b1004f8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5\" (UID: \"cd0ca11c-98f5-4734-bc9a-fef72b1004f8\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5" Sep 29 10:19:52 crc kubenswrapper[4891]: I0929 10:19:52.400234 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd0ca11c-98f5-4734-bc9a-fef72b1004f8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5\" (UID: \"cd0ca11c-98f5-4734-bc9a-fef72b1004f8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5" Sep 29 10:19:52 crc kubenswrapper[4891]: I0929 10:19:52.403334 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd0ca11c-98f5-4734-bc9a-fef72b1004f8-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5\" (UID: \"cd0ca11c-98f5-4734-bc9a-fef72b1004f8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5" Sep 29 10:19:52 crc kubenswrapper[4891]: I0929 10:19:52.412039 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsm45\" (UniqueName: \"kubernetes.io/projected/cd0ca11c-98f5-4734-bc9a-fef72b1004f8-kube-api-access-vsm45\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5\" (UID: \"cd0ca11c-98f5-4734-bc9a-fef72b1004f8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5" Sep 29 10:19:52 crc kubenswrapper[4891]: I0929 10:19:52.464207 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5" Sep 29 10:19:53 crc kubenswrapper[4891]: I0929 10:19:53.026285 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5"] Sep 29 10:19:53 crc kubenswrapper[4891]: W0929 10:19:53.035432 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd0ca11c_98f5_4734_bc9a_fef72b1004f8.slice/crio-a7f0032f53858e4e78eacbeed62ad7ffef796e9b4118061bc4f498dcd00b8b78 WatchSource:0}: Error finding container a7f0032f53858e4e78eacbeed62ad7ffef796e9b4118061bc4f498dcd00b8b78: Status 404 returned error can't find the container with id a7f0032f53858e4e78eacbeed62ad7ffef796e9b4118061bc4f498dcd00b8b78 Sep 29 10:19:54 crc kubenswrapper[4891]: I0929 10:19:54.015035 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5" event={"ID":"cd0ca11c-98f5-4734-bc9a-fef72b1004f8","Type":"ContainerStarted","Data":"ca62e7900ff7ae789325e9ebd73d5b2a258d6f5c7b21452346a23c0003631ec1"} Sep 29 10:19:54 crc kubenswrapper[4891]: I0929 10:19:54.015320 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5" event={"ID":"cd0ca11c-98f5-4734-bc9a-fef72b1004f8","Type":"ContainerStarted","Data":"a7f0032f53858e4e78eacbeed62ad7ffef796e9b4118061bc4f498dcd00b8b78"} Sep 29 10:19:54 crc kubenswrapper[4891]: I0929 10:19:54.039133 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5" podStartSLOduration=1.381985885 podStartE2EDuration="2.039106493s" podCreationTimestamp="2025-09-29 10:19:52 +0000 UTC" firstStartedPulling="2025-09-29 10:19:53.03738227 +0000 UTC m=+1923.242550581" lastFinishedPulling="2025-09-29 10:19:53.694502858 +0000 UTC m=+1923.899671189" 
observedRunningTime="2025-09-29 10:19:54.029928666 +0000 UTC m=+1924.235096997" watchObservedRunningTime="2025-09-29 10:19:54.039106493 +0000 UTC m=+1924.244274854" Sep 29 10:20:04 crc kubenswrapper[4891]: I0929 10:20:04.131722 4891 generic.go:334] "Generic (PLEG): container finished" podID="cd0ca11c-98f5-4734-bc9a-fef72b1004f8" containerID="ca62e7900ff7ae789325e9ebd73d5b2a258d6f5c7b21452346a23c0003631ec1" exitCode=0 Sep 29 10:20:04 crc kubenswrapper[4891]: I0929 10:20:04.131828 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5" event={"ID":"cd0ca11c-98f5-4734-bc9a-fef72b1004f8","Type":"ContainerDied","Data":"ca62e7900ff7ae789325e9ebd73d5b2a258d6f5c7b21452346a23c0003631ec1"} Sep 29 10:20:05 crc kubenswrapper[4891]: I0929 10:20:05.567542 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5" Sep 29 10:20:05 crc kubenswrapper[4891]: I0929 10:20:05.682150 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd0ca11c-98f5-4734-bc9a-fef72b1004f8-ssh-key\") pod \"cd0ca11c-98f5-4734-bc9a-fef72b1004f8\" (UID: \"cd0ca11c-98f5-4734-bc9a-fef72b1004f8\") " Sep 29 10:20:05 crc kubenswrapper[4891]: I0929 10:20:05.682232 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd0ca11c-98f5-4734-bc9a-fef72b1004f8-inventory\") pod \"cd0ca11c-98f5-4734-bc9a-fef72b1004f8\" (UID: \"cd0ca11c-98f5-4734-bc9a-fef72b1004f8\") " Sep 29 10:20:05 crc kubenswrapper[4891]: I0929 10:20:05.682356 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsm45\" (UniqueName: \"kubernetes.io/projected/cd0ca11c-98f5-4734-bc9a-fef72b1004f8-kube-api-access-vsm45\") pod \"cd0ca11c-98f5-4734-bc9a-fef72b1004f8\" (UID: 
\"cd0ca11c-98f5-4734-bc9a-fef72b1004f8\") " Sep 29 10:20:05 crc kubenswrapper[4891]: I0929 10:20:05.698036 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd0ca11c-98f5-4734-bc9a-fef72b1004f8-kube-api-access-vsm45" (OuterVolumeSpecName: "kube-api-access-vsm45") pod "cd0ca11c-98f5-4734-bc9a-fef72b1004f8" (UID: "cd0ca11c-98f5-4734-bc9a-fef72b1004f8"). InnerVolumeSpecName "kube-api-access-vsm45". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:20:05 crc kubenswrapper[4891]: I0929 10:20:05.715228 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd0ca11c-98f5-4734-bc9a-fef72b1004f8-inventory" (OuterVolumeSpecName: "inventory") pod "cd0ca11c-98f5-4734-bc9a-fef72b1004f8" (UID: "cd0ca11c-98f5-4734-bc9a-fef72b1004f8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:20:05 crc kubenswrapper[4891]: I0929 10:20:05.720474 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd0ca11c-98f5-4734-bc9a-fef72b1004f8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cd0ca11c-98f5-4734-bc9a-fef72b1004f8" (UID: "cd0ca11c-98f5-4734-bc9a-fef72b1004f8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:20:05 crc kubenswrapper[4891]: I0929 10:20:05.785334 4891 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd0ca11c-98f5-4734-bc9a-fef72b1004f8-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:20:05 crc kubenswrapper[4891]: I0929 10:20:05.785371 4891 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd0ca11c-98f5-4734-bc9a-fef72b1004f8-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:20:05 crc kubenswrapper[4891]: I0929 10:20:05.785387 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsm45\" (UniqueName: \"kubernetes.io/projected/cd0ca11c-98f5-4734-bc9a-fef72b1004f8-kube-api-access-vsm45\") on node \"crc\" DevicePath \"\"" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.158951 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5" event={"ID":"cd0ca11c-98f5-4734-bc9a-fef72b1004f8","Type":"ContainerDied","Data":"a7f0032f53858e4e78eacbeed62ad7ffef796e9b4118061bc4f498dcd00b8b78"} Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.159018 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7f0032f53858e4e78eacbeed62ad7ffef796e9b4118061bc4f498dcd00b8b78" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.159069 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.249069 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k"] Sep 29 10:20:06 crc kubenswrapper[4891]: E0929 10:20:06.249551 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd0ca11c-98f5-4734-bc9a-fef72b1004f8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.249576 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd0ca11c-98f5-4734-bc9a-fef72b1004f8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.249783 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd0ca11c-98f5-4734-bc9a-fef72b1004f8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.250478 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.253437 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.253628 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.253737 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b9rgd" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.253924 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.253959 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.254434 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.255126 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.255720 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.257319 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k"] Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.300434 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.300868 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.300924 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.301022 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.301124 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77wxj\" (UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-kube-api-access-77wxj\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.301206 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.301295 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.301596 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.301670 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-libvirt-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.301738 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.301862 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.302120 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.302271 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.302306 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.404354 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.405058 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.405397 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-telemetry-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.405728 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.406094 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.406336 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.406564 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-repo-setup-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.406966 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.407211 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.408023 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.408333 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 
10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.408767 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77wxj\" (UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-kube-api-access-77wxj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.409072 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.409521 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.409994 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.410279 4891 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.410710 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.410812 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.414230 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.414862 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-ovn-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.416233 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.416527 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.417075 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.417463 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 
10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.420388 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.420914 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.422201 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.432994 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77wxj\" (UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-kube-api-access-77wxj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:06 crc kubenswrapper[4891]: I0929 10:20:06.580001 
4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:07 crc kubenswrapper[4891]: I0929 10:20:07.120549 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k"] Sep 29 10:20:07 crc kubenswrapper[4891]: I0929 10:20:07.168307 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" event={"ID":"81439ac0-9a3d-434f-8122-90cc5eeeba97","Type":"ContainerStarted","Data":"fad96173cbdd794df44e5288ec46b2e71c7d8b86306bfac4292c76fd874f941a"} Sep 29 10:20:09 crc kubenswrapper[4891]: I0929 10:20:09.193815 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" event={"ID":"81439ac0-9a3d-434f-8122-90cc5eeeba97","Type":"ContainerStarted","Data":"e87a23e7ffae020b7107b3962a6bc5d6b18688bca5a6a2841fc9127e38ff41f0"} Sep 29 10:20:09 crc kubenswrapper[4891]: I0929 10:20:09.221121 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" podStartSLOduration=2.45169496 podStartE2EDuration="3.221103202s" podCreationTimestamp="2025-09-29 10:20:06 +0000 UTC" firstStartedPulling="2025-09-29 10:20:07.131907423 +0000 UTC m=+1937.337075744" lastFinishedPulling="2025-09-29 10:20:07.901315665 +0000 UTC m=+1938.106483986" observedRunningTime="2025-09-29 10:20:09.211380369 +0000 UTC m=+1939.416548700" watchObservedRunningTime="2025-09-29 10:20:09.221103202 +0000 UTC m=+1939.426271523" Sep 29 10:20:34 crc kubenswrapper[4891]: I0929 10:20:34.675893 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qpqb7"] Sep 29 10:20:34 crc kubenswrapper[4891]: I0929 10:20:34.680566 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qpqb7" Sep 29 10:20:34 crc kubenswrapper[4891]: I0929 10:20:34.685876 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qpqb7"] Sep 29 10:20:34 crc kubenswrapper[4891]: I0929 10:20:34.705043 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f660507c-938f-42ba-ab12-d021caa93a4f-catalog-content\") pod \"redhat-operators-qpqb7\" (UID: \"f660507c-938f-42ba-ab12-d021caa93a4f\") " pod="openshift-marketplace/redhat-operators-qpqb7" Sep 29 10:20:34 crc kubenswrapper[4891]: I0929 10:20:34.705196 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqhhj\" (UniqueName: \"kubernetes.io/projected/f660507c-938f-42ba-ab12-d021caa93a4f-kube-api-access-hqhhj\") pod \"redhat-operators-qpqb7\" (UID: \"f660507c-938f-42ba-ab12-d021caa93a4f\") " pod="openshift-marketplace/redhat-operators-qpqb7" Sep 29 10:20:34 crc kubenswrapper[4891]: I0929 10:20:34.705246 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f660507c-938f-42ba-ab12-d021caa93a4f-utilities\") pod \"redhat-operators-qpqb7\" (UID: \"f660507c-938f-42ba-ab12-d021caa93a4f\") " pod="openshift-marketplace/redhat-operators-qpqb7" Sep 29 10:20:34 crc kubenswrapper[4891]: I0929 10:20:34.806668 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f660507c-938f-42ba-ab12-d021caa93a4f-catalog-content\") pod \"redhat-operators-qpqb7\" (UID: \"f660507c-938f-42ba-ab12-d021caa93a4f\") " pod="openshift-marketplace/redhat-operators-qpqb7" Sep 29 10:20:34 crc kubenswrapper[4891]: I0929 10:20:34.806834 4891 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-hqhhj\" (UniqueName: \"kubernetes.io/projected/f660507c-938f-42ba-ab12-d021caa93a4f-kube-api-access-hqhhj\") pod \"redhat-operators-qpqb7\" (UID: \"f660507c-938f-42ba-ab12-d021caa93a4f\") " pod="openshift-marketplace/redhat-operators-qpqb7" Sep 29 10:20:34 crc kubenswrapper[4891]: I0929 10:20:34.806877 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f660507c-938f-42ba-ab12-d021caa93a4f-utilities\") pod \"redhat-operators-qpqb7\" (UID: \"f660507c-938f-42ba-ab12-d021caa93a4f\") " pod="openshift-marketplace/redhat-operators-qpqb7" Sep 29 10:20:34 crc kubenswrapper[4891]: I0929 10:20:34.807393 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f660507c-938f-42ba-ab12-d021caa93a4f-utilities\") pod \"redhat-operators-qpqb7\" (UID: \"f660507c-938f-42ba-ab12-d021caa93a4f\") " pod="openshift-marketplace/redhat-operators-qpqb7" Sep 29 10:20:34 crc kubenswrapper[4891]: I0929 10:20:34.807842 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f660507c-938f-42ba-ab12-d021caa93a4f-catalog-content\") pod \"redhat-operators-qpqb7\" (UID: \"f660507c-938f-42ba-ab12-d021caa93a4f\") " pod="openshift-marketplace/redhat-operators-qpqb7" Sep 29 10:20:34 crc kubenswrapper[4891]: I0929 10:20:34.833660 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqhhj\" (UniqueName: \"kubernetes.io/projected/f660507c-938f-42ba-ab12-d021caa93a4f-kube-api-access-hqhhj\") pod \"redhat-operators-qpqb7\" (UID: \"f660507c-938f-42ba-ab12-d021caa93a4f\") " pod="openshift-marketplace/redhat-operators-qpqb7" Sep 29 10:20:35 crc kubenswrapper[4891]: I0929 10:20:35.023950 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qpqb7" Sep 29 10:20:35 crc kubenswrapper[4891]: I0929 10:20:35.504423 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qpqb7"] Sep 29 10:20:36 crc kubenswrapper[4891]: I0929 10:20:36.449009 4891 generic.go:334] "Generic (PLEG): container finished" podID="f660507c-938f-42ba-ab12-d021caa93a4f" containerID="403df059ac71146858eb817f7da23fef945bdaee199ab49ac0fb261352bf3a27" exitCode=0 Sep 29 10:20:36 crc kubenswrapper[4891]: I0929 10:20:36.449068 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpqb7" event={"ID":"f660507c-938f-42ba-ab12-d021caa93a4f","Type":"ContainerDied","Data":"403df059ac71146858eb817f7da23fef945bdaee199ab49ac0fb261352bf3a27"} Sep 29 10:20:36 crc kubenswrapper[4891]: I0929 10:20:36.449286 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpqb7" event={"ID":"f660507c-938f-42ba-ab12-d021caa93a4f","Type":"ContainerStarted","Data":"6e81a12cc69c300e29b845327d3db4e1a44552543a82b1f08cf474b54aa8db32"} Sep 29 10:20:38 crc kubenswrapper[4891]: I0929 10:20:38.486311 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpqb7" event={"ID":"f660507c-938f-42ba-ab12-d021caa93a4f","Type":"ContainerStarted","Data":"02d0aff76a0de14d2b64261217fc23b041b3d69b9bc054ff1245dde1656e857a"} Sep 29 10:20:39 crc kubenswrapper[4891]: I0929 10:20:39.497110 4891 generic.go:334] "Generic (PLEG): container finished" podID="f660507c-938f-42ba-ab12-d021caa93a4f" containerID="02d0aff76a0de14d2b64261217fc23b041b3d69b9bc054ff1245dde1656e857a" exitCode=0 Sep 29 10:20:39 crc kubenswrapper[4891]: I0929 10:20:39.497171 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpqb7" 
event={"ID":"f660507c-938f-42ba-ab12-d021caa93a4f","Type":"ContainerDied","Data":"02d0aff76a0de14d2b64261217fc23b041b3d69b9bc054ff1245dde1656e857a"} Sep 29 10:20:41 crc kubenswrapper[4891]: I0929 10:20:41.519163 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpqb7" event={"ID":"f660507c-938f-42ba-ab12-d021caa93a4f","Type":"ContainerStarted","Data":"148e57dad2b2b11ed521f207d05cba340e9c2a07d90cb66c979e8f9c4e1f7cee"} Sep 29 10:20:41 crc kubenswrapper[4891]: I0929 10:20:41.546024 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qpqb7" podStartSLOduration=3.5589938439999997 podStartE2EDuration="7.545993005s" podCreationTimestamp="2025-09-29 10:20:34 +0000 UTC" firstStartedPulling="2025-09-29 10:20:36.451499846 +0000 UTC m=+1966.656668187" lastFinishedPulling="2025-09-29 10:20:40.438499027 +0000 UTC m=+1970.643667348" observedRunningTime="2025-09-29 10:20:41.539073323 +0000 UTC m=+1971.744241654" watchObservedRunningTime="2025-09-29 10:20:41.545993005 +0000 UTC m=+1971.751161326" Sep 29 10:20:45 crc kubenswrapper[4891]: I0929 10:20:45.024557 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qpqb7" Sep 29 10:20:45 crc kubenswrapper[4891]: I0929 10:20:45.025260 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qpqb7" Sep 29 10:20:45 crc kubenswrapper[4891]: I0929 10:20:45.076621 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qpqb7" Sep 29 10:20:45 crc kubenswrapper[4891]: I0929 10:20:45.599069 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qpqb7" Sep 29 10:20:45 crc kubenswrapper[4891]: I0929 10:20:45.686239 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-qpqb7"] Sep 29 10:20:47 crc kubenswrapper[4891]: I0929 10:20:47.570300 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qpqb7" podUID="f660507c-938f-42ba-ab12-d021caa93a4f" containerName="registry-server" containerID="cri-o://148e57dad2b2b11ed521f207d05cba340e9c2a07d90cb66c979e8f9c4e1f7cee" gracePeriod=2 Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.033855 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qpqb7" Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.189577 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqhhj\" (UniqueName: \"kubernetes.io/projected/f660507c-938f-42ba-ab12-d021caa93a4f-kube-api-access-hqhhj\") pod \"f660507c-938f-42ba-ab12-d021caa93a4f\" (UID: \"f660507c-938f-42ba-ab12-d021caa93a4f\") " Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.189630 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f660507c-938f-42ba-ab12-d021caa93a4f-catalog-content\") pod \"f660507c-938f-42ba-ab12-d021caa93a4f\" (UID: \"f660507c-938f-42ba-ab12-d021caa93a4f\") " Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.189730 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f660507c-938f-42ba-ab12-d021caa93a4f-utilities\") pod \"f660507c-938f-42ba-ab12-d021caa93a4f\" (UID: \"f660507c-938f-42ba-ab12-d021caa93a4f\") " Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.190576 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f660507c-938f-42ba-ab12-d021caa93a4f-utilities" (OuterVolumeSpecName: "utilities") pod "f660507c-938f-42ba-ab12-d021caa93a4f" (UID: 
"f660507c-938f-42ba-ab12-d021caa93a4f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.195176 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f660507c-938f-42ba-ab12-d021caa93a4f-kube-api-access-hqhhj" (OuterVolumeSpecName: "kube-api-access-hqhhj") pod "f660507c-938f-42ba-ab12-d021caa93a4f" (UID: "f660507c-938f-42ba-ab12-d021caa93a4f"). InnerVolumeSpecName "kube-api-access-hqhhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.270719 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f660507c-938f-42ba-ab12-d021caa93a4f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f660507c-938f-42ba-ab12-d021caa93a4f" (UID: "f660507c-938f-42ba-ab12-d021caa93a4f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.291823 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqhhj\" (UniqueName: \"kubernetes.io/projected/f660507c-938f-42ba-ab12-d021caa93a4f-kube-api-access-hqhhj\") on node \"crc\" DevicePath \"\"" Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.291864 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f660507c-938f-42ba-ab12-d021caa93a4f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.291873 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f660507c-938f-42ba-ab12-d021caa93a4f-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.581897 4891 generic.go:334] "Generic (PLEG): container finished" 
podID="81439ac0-9a3d-434f-8122-90cc5eeeba97" containerID="e87a23e7ffae020b7107b3962a6bc5d6b18688bca5a6a2841fc9127e38ff41f0" exitCode=0 Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.582029 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" event={"ID":"81439ac0-9a3d-434f-8122-90cc5eeeba97","Type":"ContainerDied","Data":"e87a23e7ffae020b7107b3962a6bc5d6b18688bca5a6a2841fc9127e38ff41f0"} Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.587901 4891 generic.go:334] "Generic (PLEG): container finished" podID="f660507c-938f-42ba-ab12-d021caa93a4f" containerID="148e57dad2b2b11ed521f207d05cba340e9c2a07d90cb66c979e8f9c4e1f7cee" exitCode=0 Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.587961 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpqb7" event={"ID":"f660507c-938f-42ba-ab12-d021caa93a4f","Type":"ContainerDied","Data":"148e57dad2b2b11ed521f207d05cba340e9c2a07d90cb66c979e8f9c4e1f7cee"} Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.588022 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpqb7" event={"ID":"f660507c-938f-42ba-ab12-d021caa93a4f","Type":"ContainerDied","Data":"6e81a12cc69c300e29b845327d3db4e1a44552543a82b1f08cf474b54aa8db32"} Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.588046 4891 scope.go:117] "RemoveContainer" containerID="148e57dad2b2b11ed521f207d05cba340e9c2a07d90cb66c979e8f9c4e1f7cee" Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.588515 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qpqb7" Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.615490 4891 scope.go:117] "RemoveContainer" containerID="02d0aff76a0de14d2b64261217fc23b041b3d69b9bc054ff1245dde1656e857a" Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.629835 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qpqb7"] Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.642909 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qpqb7"] Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.647498 4891 scope.go:117] "RemoveContainer" containerID="403df059ac71146858eb817f7da23fef945bdaee199ab49ac0fb261352bf3a27" Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.689572 4891 scope.go:117] "RemoveContainer" containerID="148e57dad2b2b11ed521f207d05cba340e9c2a07d90cb66c979e8f9c4e1f7cee" Sep 29 10:20:48 crc kubenswrapper[4891]: E0929 10:20:48.690094 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"148e57dad2b2b11ed521f207d05cba340e9c2a07d90cb66c979e8f9c4e1f7cee\": container with ID starting with 148e57dad2b2b11ed521f207d05cba340e9c2a07d90cb66c979e8f9c4e1f7cee not found: ID does not exist" containerID="148e57dad2b2b11ed521f207d05cba340e9c2a07d90cb66c979e8f9c4e1f7cee" Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.690133 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"148e57dad2b2b11ed521f207d05cba340e9c2a07d90cb66c979e8f9c4e1f7cee"} err="failed to get container status \"148e57dad2b2b11ed521f207d05cba340e9c2a07d90cb66c979e8f9c4e1f7cee\": rpc error: code = NotFound desc = could not find container \"148e57dad2b2b11ed521f207d05cba340e9c2a07d90cb66c979e8f9c4e1f7cee\": container with ID starting with 148e57dad2b2b11ed521f207d05cba340e9c2a07d90cb66c979e8f9c4e1f7cee not found: ID does 
not exist" Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.690161 4891 scope.go:117] "RemoveContainer" containerID="02d0aff76a0de14d2b64261217fc23b041b3d69b9bc054ff1245dde1656e857a" Sep 29 10:20:48 crc kubenswrapper[4891]: E0929 10:20:48.690773 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02d0aff76a0de14d2b64261217fc23b041b3d69b9bc054ff1245dde1656e857a\": container with ID starting with 02d0aff76a0de14d2b64261217fc23b041b3d69b9bc054ff1245dde1656e857a not found: ID does not exist" containerID="02d0aff76a0de14d2b64261217fc23b041b3d69b9bc054ff1245dde1656e857a" Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.690817 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02d0aff76a0de14d2b64261217fc23b041b3d69b9bc054ff1245dde1656e857a"} err="failed to get container status \"02d0aff76a0de14d2b64261217fc23b041b3d69b9bc054ff1245dde1656e857a\": rpc error: code = NotFound desc = could not find container \"02d0aff76a0de14d2b64261217fc23b041b3d69b9bc054ff1245dde1656e857a\": container with ID starting with 02d0aff76a0de14d2b64261217fc23b041b3d69b9bc054ff1245dde1656e857a not found: ID does not exist" Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.690835 4891 scope.go:117] "RemoveContainer" containerID="403df059ac71146858eb817f7da23fef945bdaee199ab49ac0fb261352bf3a27" Sep 29 10:20:48 crc kubenswrapper[4891]: E0929 10:20:48.691131 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"403df059ac71146858eb817f7da23fef945bdaee199ab49ac0fb261352bf3a27\": container with ID starting with 403df059ac71146858eb817f7da23fef945bdaee199ab49ac0fb261352bf3a27 not found: ID does not exist" containerID="403df059ac71146858eb817f7da23fef945bdaee199ab49ac0fb261352bf3a27" Sep 29 10:20:48 crc kubenswrapper[4891]: I0929 10:20:48.691160 4891 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"403df059ac71146858eb817f7da23fef945bdaee199ab49ac0fb261352bf3a27"} err="failed to get container status \"403df059ac71146858eb817f7da23fef945bdaee199ab49ac0fb261352bf3a27\": rpc error: code = NotFound desc = could not find container \"403df059ac71146858eb817f7da23fef945bdaee199ab49ac0fb261352bf3a27\": container with ID starting with 403df059ac71146858eb817f7da23fef945bdaee199ab49ac0fb261352bf3a27 not found: ID does not exist" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.053711 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.130907 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-ssh-key\") pod \"81439ac0-9a3d-434f-8122-90cc5eeeba97\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.130970 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-repo-setup-combined-ca-bundle\") pod \"81439ac0-9a3d-434f-8122-90cc5eeeba97\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.131008 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-libvirt-combined-ca-bundle\") pod \"81439ac0-9a3d-434f-8122-90cc5eeeba97\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.131041 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-openstack-edpm-ipam-ovn-default-certs-0\") pod \"81439ac0-9a3d-434f-8122-90cc5eeeba97\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.131066 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-bootstrap-combined-ca-bundle\") pod \"81439ac0-9a3d-434f-8122-90cc5eeeba97\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.131092 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-inventory\") pod \"81439ac0-9a3d-434f-8122-90cc5eeeba97\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.131112 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-telemetry-combined-ca-bundle\") pod \"81439ac0-9a3d-434f-8122-90cc5eeeba97\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.131164 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"81439ac0-9a3d-434f-8122-90cc5eeeba97\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.137824 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-libvirt-combined-ca-bundle" (OuterVolumeSpecName: 
"libvirt-combined-ca-bundle") pod "81439ac0-9a3d-434f-8122-90cc5eeeba97" (UID: "81439ac0-9a3d-434f-8122-90cc5eeeba97"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.138412 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "81439ac0-9a3d-434f-8122-90cc5eeeba97" (UID: "81439ac0-9a3d-434f-8122-90cc5eeeba97"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.138468 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "81439ac0-9a3d-434f-8122-90cc5eeeba97" (UID: "81439ac0-9a3d-434f-8122-90cc5eeeba97"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.138832 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "81439ac0-9a3d-434f-8122-90cc5eeeba97" (UID: "81439ac0-9a3d-434f-8122-90cc5eeeba97"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.139643 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "81439ac0-9a3d-434f-8122-90cc5eeeba97" (UID: "81439ac0-9a3d-434f-8122-90cc5eeeba97"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.140923 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "81439ac0-9a3d-434f-8122-90cc5eeeba97" (UID: "81439ac0-9a3d-434f-8122-90cc5eeeba97"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.175354 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "81439ac0-9a3d-434f-8122-90cc5eeeba97" (UID: "81439ac0-9a3d-434f-8122-90cc5eeeba97"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.178970 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-inventory" (OuterVolumeSpecName: "inventory") pod "81439ac0-9a3d-434f-8122-90cc5eeeba97" (UID: "81439ac0-9a3d-434f-8122-90cc5eeeba97"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.232557 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-ovn-combined-ca-bundle\") pod \"81439ac0-9a3d-434f-8122-90cc5eeeba97\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.232709 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"81439ac0-9a3d-434f-8122-90cc5eeeba97\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.232766 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-nova-combined-ca-bundle\") pod \"81439ac0-9a3d-434f-8122-90cc5eeeba97\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.232818 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"81439ac0-9a3d-434f-8122-90cc5eeeba97\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.232845 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77wxj\" (UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-kube-api-access-77wxj\") pod \"81439ac0-9a3d-434f-8122-90cc5eeeba97\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " Sep 29 
10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.232878 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-neutron-metadata-combined-ca-bundle\") pod \"81439ac0-9a3d-434f-8122-90cc5eeeba97\" (UID: \"81439ac0-9a3d-434f-8122-90cc5eeeba97\") " Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.233212 4891 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.233259 4891 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.233273 4891 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.233285 4891 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.233297 4891 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.233309 4891 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.233321 4891 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.233335 4891 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.237436 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "81439ac0-9a3d-434f-8122-90cc5eeeba97" (UID: "81439ac0-9a3d-434f-8122-90cc5eeeba97"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.237687 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "81439ac0-9a3d-434f-8122-90cc5eeeba97" (UID: "81439ac0-9a3d-434f-8122-90cc5eeeba97"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.238253 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-kube-api-access-77wxj" (OuterVolumeSpecName: "kube-api-access-77wxj") pod "81439ac0-9a3d-434f-8122-90cc5eeeba97" (UID: "81439ac0-9a3d-434f-8122-90cc5eeeba97"). InnerVolumeSpecName "kube-api-access-77wxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.238563 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "81439ac0-9a3d-434f-8122-90cc5eeeba97" (UID: "81439ac0-9a3d-434f-8122-90cc5eeeba97"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.239039 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "81439ac0-9a3d-434f-8122-90cc5eeeba97" (UID: "81439ac0-9a3d-434f-8122-90cc5eeeba97"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.240086 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "81439ac0-9a3d-434f-8122-90cc5eeeba97" (UID: "81439ac0-9a3d-434f-8122-90cc5eeeba97"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.335391 4891 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.335435 4891 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.335452 4891 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.335464 4891 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81439ac0-9a3d-434f-8122-90cc5eeeba97-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.335478 4891 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.335489 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77wxj\" (UniqueName: \"kubernetes.io/projected/81439ac0-9a3d-434f-8122-90cc5eeeba97-kube-api-access-77wxj\") on node \"crc\" DevicePath \"\"" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.408618 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f660507c-938f-42ba-ab12-d021caa93a4f" path="/var/lib/kubelet/pods/f660507c-938f-42ba-ab12-d021caa93a4f/volumes" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.611826 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" event={"ID":"81439ac0-9a3d-434f-8122-90cc5eeeba97","Type":"ContainerDied","Data":"fad96173cbdd794df44e5288ec46b2e71c7d8b86306bfac4292c76fd874f941a"} Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.611875 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fad96173cbdd794df44e5288ec46b2e71c7d8b86306bfac4292c76fd874f941a" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.611943 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.726343 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc"] Sep 29 10:20:50 crc kubenswrapper[4891]: E0929 10:20:50.726719 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81439ac0-9a3d-434f-8122-90cc5eeeba97" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.726737 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="81439ac0-9a3d-434f-8122-90cc5eeeba97" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 29 10:20:50 crc kubenswrapper[4891]: E0929 10:20:50.726749 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f660507c-938f-42ba-ab12-d021caa93a4f" containerName="registry-server" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.726756 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="f660507c-938f-42ba-ab12-d021caa93a4f" containerName="registry-server" Sep 29 10:20:50 crc kubenswrapper[4891]: E0929 10:20:50.726781 
4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f660507c-938f-42ba-ab12-d021caa93a4f" containerName="extract-utilities" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.726801 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="f660507c-938f-42ba-ab12-d021caa93a4f" containerName="extract-utilities" Sep 29 10:20:50 crc kubenswrapper[4891]: E0929 10:20:50.726819 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f660507c-938f-42ba-ab12-d021caa93a4f" containerName="extract-content" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.726825 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="f660507c-938f-42ba-ab12-d021caa93a4f" containerName="extract-content" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.726999 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="81439ac0-9a3d-434f-8122-90cc5eeeba97" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.727016 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="f660507c-938f-42ba-ab12-d021caa93a4f" containerName="registry-server" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.727673 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.732140 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.732377 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.732525 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.732686 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.736095 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b9rgd" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.740316 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc"] Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.742235 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc04648-883f-4273-bf36-d550e5caba61-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frbbc\" (UID: \"1dc04648-883f-4273-bf36-d550e5caba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.742283 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1dc04648-883f-4273-bf36-d550e5caba61-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frbbc\" (UID: \"1dc04648-883f-4273-bf36-d550e5caba61\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.742332 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7778\" (UniqueName: \"kubernetes.io/projected/1dc04648-883f-4273-bf36-d550e5caba61-kube-api-access-v7778\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frbbc\" (UID: \"1dc04648-883f-4273-bf36-d550e5caba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.742364 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dc04648-883f-4273-bf36-d550e5caba61-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frbbc\" (UID: \"1dc04648-883f-4273-bf36-d550e5caba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.742410 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1dc04648-883f-4273-bf36-d550e5caba61-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frbbc\" (UID: \"1dc04648-883f-4273-bf36-d550e5caba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.844379 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dc04648-883f-4273-bf36-d550e5caba61-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frbbc\" (UID: \"1dc04648-883f-4273-bf36-d550e5caba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.844467 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/1dc04648-883f-4273-bf36-d550e5caba61-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frbbc\" (UID: \"1dc04648-883f-4273-bf36-d550e5caba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.844583 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc04648-883f-4273-bf36-d550e5caba61-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frbbc\" (UID: \"1dc04648-883f-4273-bf36-d550e5caba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.844613 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1dc04648-883f-4273-bf36-d550e5caba61-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frbbc\" (UID: \"1dc04648-883f-4273-bf36-d550e5caba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.844666 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7778\" (UniqueName: \"kubernetes.io/projected/1dc04648-883f-4273-bf36-d550e5caba61-kube-api-access-v7778\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frbbc\" (UID: \"1dc04648-883f-4273-bf36-d550e5caba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.846072 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1dc04648-883f-4273-bf36-d550e5caba61-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frbbc\" (UID: \"1dc04648-883f-4273-bf36-d550e5caba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc" Sep 29 10:20:50 crc 
kubenswrapper[4891]: I0929 10:20:50.849407 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1dc04648-883f-4273-bf36-d550e5caba61-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frbbc\" (UID: \"1dc04648-883f-4273-bf36-d550e5caba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.849613 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc04648-883f-4273-bf36-d550e5caba61-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frbbc\" (UID: \"1dc04648-883f-4273-bf36-d550e5caba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.849778 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dc04648-883f-4273-bf36-d550e5caba61-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frbbc\" (UID: \"1dc04648-883f-4273-bf36-d550e5caba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc" Sep 29 10:20:50 crc kubenswrapper[4891]: I0929 10:20:50.862575 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7778\" (UniqueName: \"kubernetes.io/projected/1dc04648-883f-4273-bf36-d550e5caba61-kube-api-access-v7778\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frbbc\" (UID: \"1dc04648-883f-4273-bf36-d550e5caba61\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc" Sep 29 10:20:51 crc kubenswrapper[4891]: I0929 10:20:51.043626 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc" Sep 29 10:20:51 crc kubenswrapper[4891]: I0929 10:20:51.590950 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc"] Sep 29 10:20:51 crc kubenswrapper[4891]: I0929 10:20:51.625572 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc" event={"ID":"1dc04648-883f-4273-bf36-d550e5caba61","Type":"ContainerStarted","Data":"f553c846e73b036229504b93de831525648ac38690c1176e9875c0f51fc4e134"} Sep 29 10:20:52 crc kubenswrapper[4891]: I0929 10:20:52.657663 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc" podStartSLOduration=1.9364641219999998 podStartE2EDuration="2.657643957s" podCreationTimestamp="2025-09-29 10:20:50 +0000 UTC" firstStartedPulling="2025-09-29 10:20:51.59922874 +0000 UTC m=+1981.804397061" lastFinishedPulling="2025-09-29 10:20:52.320408535 +0000 UTC m=+1982.525576896" observedRunningTime="2025-09-29 10:20:52.655216026 +0000 UTC m=+1982.860384377" watchObservedRunningTime="2025-09-29 10:20:52.657643957 +0000 UTC m=+1982.862812288" Sep 29 10:20:53 crc kubenswrapper[4891]: I0929 10:20:53.649404 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc" event={"ID":"1dc04648-883f-4273-bf36-d550e5caba61","Type":"ContainerStarted","Data":"8c09bf927e886f1a53284b7a653660e424adaf46669b42074b350683c913f309"} Sep 29 10:20:53 crc kubenswrapper[4891]: E0929 10:20:53.731399 4891 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81439ac0_9a3d_434f_8122_90cc5eeeba97.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81439ac0_9a3d_434f_8122_90cc5eeeba97.slice/crio-fad96173cbdd794df44e5288ec46b2e71c7d8b86306bfac4292c76fd874f941a\": RecentStats: unable to find data in memory cache]" Sep 29 10:21:03 crc kubenswrapper[4891]: E0929 10:21:03.977308 4891 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81439ac0_9a3d_434f_8122_90cc5eeeba97.slice/crio-fad96173cbdd794df44e5288ec46b2e71c7d8b86306bfac4292c76fd874f941a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81439ac0_9a3d_434f_8122_90cc5eeeba97.slice\": RecentStats: unable to find data in memory cache]" Sep 29 10:21:14 crc kubenswrapper[4891]: E0929 10:21:14.216277 4891 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81439ac0_9a3d_434f_8122_90cc5eeeba97.slice/crio-fad96173cbdd794df44e5288ec46b2e71c7d8b86306bfac4292c76fd874f941a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81439ac0_9a3d_434f_8122_90cc5eeeba97.slice\": RecentStats: unable to find data in memory cache]" Sep 29 10:21:24 crc kubenswrapper[4891]: E0929 10:21:24.487893 4891 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81439ac0_9a3d_434f_8122_90cc5eeeba97.slice/crio-fad96173cbdd794df44e5288ec46b2e71c7d8b86306bfac4292c76fd874f941a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81439ac0_9a3d_434f_8122_90cc5eeeba97.slice\": RecentStats: unable to find data in memory cache]" Sep 29 10:21:34 crc kubenswrapper[4891]: 
E0929 10:21:34.741510 4891 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81439ac0_9a3d_434f_8122_90cc5eeeba97.slice/crio-fad96173cbdd794df44e5288ec46b2e71c7d8b86306bfac4292c76fd874f941a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81439ac0_9a3d_434f_8122_90cc5eeeba97.slice\": RecentStats: unable to find data in memory cache]" Sep 29 10:21:44 crc kubenswrapper[4891]: E0929 10:21:44.994879 4891 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81439ac0_9a3d_434f_8122_90cc5eeeba97.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81439ac0_9a3d_434f_8122_90cc5eeeba97.slice/crio-fad96173cbdd794df44e5288ec46b2e71c7d8b86306bfac4292c76fd874f941a\": RecentStats: unable to find data in memory cache]" Sep 29 10:21:57 crc kubenswrapper[4891]: I0929 10:21:57.284933 4891 generic.go:334] "Generic (PLEG): container finished" podID="1dc04648-883f-4273-bf36-d550e5caba61" containerID="8c09bf927e886f1a53284b7a653660e424adaf46669b42074b350683c913f309" exitCode=0 Sep 29 10:21:57 crc kubenswrapper[4891]: I0929 10:21:57.285038 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc" event={"ID":"1dc04648-883f-4273-bf36-d550e5caba61","Type":"ContainerDied","Data":"8c09bf927e886f1a53284b7a653660e424adaf46669b42074b350683c913f309"} Sep 29 10:21:58 crc kubenswrapper[4891]: I0929 10:21:58.744573 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc" Sep 29 10:21:58 crc kubenswrapper[4891]: I0929 10:21:58.772357 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7778\" (UniqueName: \"kubernetes.io/projected/1dc04648-883f-4273-bf36-d550e5caba61-kube-api-access-v7778\") pod \"1dc04648-883f-4273-bf36-d550e5caba61\" (UID: \"1dc04648-883f-4273-bf36-d550e5caba61\") " Sep 29 10:21:58 crc kubenswrapper[4891]: I0929 10:21:58.772428 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1dc04648-883f-4273-bf36-d550e5caba61-ssh-key\") pod \"1dc04648-883f-4273-bf36-d550e5caba61\" (UID: \"1dc04648-883f-4273-bf36-d550e5caba61\") " Sep 29 10:21:58 crc kubenswrapper[4891]: I0929 10:21:58.772507 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc04648-883f-4273-bf36-d550e5caba61-ovn-combined-ca-bundle\") pod \"1dc04648-883f-4273-bf36-d550e5caba61\" (UID: \"1dc04648-883f-4273-bf36-d550e5caba61\") " Sep 29 10:21:58 crc kubenswrapper[4891]: I0929 10:21:58.772540 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dc04648-883f-4273-bf36-d550e5caba61-inventory\") pod \"1dc04648-883f-4273-bf36-d550e5caba61\" (UID: \"1dc04648-883f-4273-bf36-d550e5caba61\") " Sep 29 10:21:58 crc kubenswrapper[4891]: I0929 10:21:58.772610 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1dc04648-883f-4273-bf36-d550e5caba61-ovncontroller-config-0\") pod \"1dc04648-883f-4273-bf36-d550e5caba61\" (UID: \"1dc04648-883f-4273-bf36-d550e5caba61\") " Sep 29 10:21:58 crc kubenswrapper[4891]: I0929 10:21:58.781268 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/1dc04648-883f-4273-bf36-d550e5caba61-kube-api-access-v7778" (OuterVolumeSpecName: "kube-api-access-v7778") pod "1dc04648-883f-4273-bf36-d550e5caba61" (UID: "1dc04648-883f-4273-bf36-d550e5caba61"). InnerVolumeSpecName "kube-api-access-v7778". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:21:58 crc kubenswrapper[4891]: I0929 10:21:58.783112 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dc04648-883f-4273-bf36-d550e5caba61-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "1dc04648-883f-4273-bf36-d550e5caba61" (UID: "1dc04648-883f-4273-bf36-d550e5caba61"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:21:58 crc kubenswrapper[4891]: I0929 10:21:58.804388 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dc04648-883f-4273-bf36-d550e5caba61-inventory" (OuterVolumeSpecName: "inventory") pod "1dc04648-883f-4273-bf36-d550e5caba61" (UID: "1dc04648-883f-4273-bf36-d550e5caba61"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:21:58 crc kubenswrapper[4891]: I0929 10:21:58.811643 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dc04648-883f-4273-bf36-d550e5caba61-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1dc04648-883f-4273-bf36-d550e5caba61" (UID: "1dc04648-883f-4273-bf36-d550e5caba61"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:21:58 crc kubenswrapper[4891]: I0929 10:21:58.813257 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dc04648-883f-4273-bf36-d550e5caba61-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "1dc04648-883f-4273-bf36-d550e5caba61" (UID: "1dc04648-883f-4273-bf36-d550e5caba61"). 
InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:21:58 crc kubenswrapper[4891]: I0929 10:21:58.875461 4891 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc04648-883f-4273-bf36-d550e5caba61-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:21:58 crc kubenswrapper[4891]: I0929 10:21:58.875543 4891 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dc04648-883f-4273-bf36-d550e5caba61-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:21:58 crc kubenswrapper[4891]: I0929 10:21:58.875570 4891 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1dc04648-883f-4273-bf36-d550e5caba61-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:21:58 crc kubenswrapper[4891]: I0929 10:21:58.875595 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7778\" (UniqueName: \"kubernetes.io/projected/1dc04648-883f-4273-bf36-d550e5caba61-kube-api-access-v7778\") on node \"crc\" DevicePath \"\"" Sep 29 10:21:58 crc kubenswrapper[4891]: I0929 10:21:58.875620 4891 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1dc04648-883f-4273-bf36-d550e5caba61-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.312340 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc" event={"ID":"1dc04648-883f-4273-bf36-d550e5caba61","Type":"ContainerDied","Data":"f553c846e73b036229504b93de831525648ac38690c1176e9875c0f51fc4e134"} Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.312422 4891 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="f553c846e73b036229504b93de831525648ac38690c1176e9875c0f51fc4e134" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.312454 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frbbc" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.429544 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7"] Sep 29 10:21:59 crc kubenswrapper[4891]: E0929 10:21:59.430049 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc04648-883f-4273-bf36-d550e5caba61" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.430069 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc04648-883f-4273-bf36-d550e5caba61" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.430293 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc04648-883f-4273-bf36-d550e5caba61" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.431037 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.435692 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.435973 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.435993 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.436438 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.436836 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.440461 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b9rgd" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.450299 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7"] Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.487898 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7\" (UID: \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.487986 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7\" (UID: \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.488014 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtqc8\" (UniqueName: \"kubernetes.io/projected/cc623a81-2fe0-42a2-8f61-cb9ab6909984-kube-api-access-mtqc8\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7\" (UID: \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.488187 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7\" (UID: \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.488282 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7\" (UID: \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.488484 4891 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7\" (UID: \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.589619 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7\" (UID: \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.590226 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7\" (UID: \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.590350 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtqc8\" (UniqueName: \"kubernetes.io/projected/cc623a81-2fe0-42a2-8f61-cb9ab6909984-kube-api-access-mtqc8\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7\" (UID: \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.590584 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7\" (UID: \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.590698 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7\" (UID: \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.590854 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7\" (UID: \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.594806 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7\" (UID: \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.595230 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7\" (UID: \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.595838 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7\" (UID: \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.603955 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7\" (UID: \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.604203 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7\" (UID: \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.611166 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtqc8\" (UniqueName: \"kubernetes.io/projected/cc623a81-2fe0-42a2-8f61-cb9ab6909984-kube-api-access-mtqc8\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7\" (UID: \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" Sep 29 10:21:59 crc kubenswrapper[4891]: I0929 10:21:59.753309 4891 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" Sep 29 10:22:00 crc kubenswrapper[4891]: I0929 10:22:00.115884 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7"] Sep 29 10:22:00 crc kubenswrapper[4891]: I0929 10:22:00.324218 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" event={"ID":"cc623a81-2fe0-42a2-8f61-cb9ab6909984","Type":"ContainerStarted","Data":"0f4ed1ea3b389b075200e68c9d4c2a5de9d7d6492836b560089aab2c129edf5e"} Sep 29 10:22:01 crc kubenswrapper[4891]: I0929 10:22:01.333700 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" event={"ID":"cc623a81-2fe0-42a2-8f61-cb9ab6909984","Type":"ContainerStarted","Data":"157d8478502a4f611feabe4e4f2c653303d394354d48cd03506d767dc7f8e848"} Sep 29 10:22:01 crc kubenswrapper[4891]: I0929 10:22:01.356331 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" podStartSLOduration=1.744541256 podStartE2EDuration="2.356312041s" podCreationTimestamp="2025-09-29 10:21:59 +0000 UTC" firstStartedPulling="2025-09-29 10:22:00.123187121 +0000 UTC m=+2050.328355442" lastFinishedPulling="2025-09-29 10:22:00.734957896 +0000 UTC m=+2050.940126227" observedRunningTime="2025-09-29 10:22:01.347310239 +0000 UTC m=+2051.552478560" watchObservedRunningTime="2025-09-29 10:22:01.356312041 +0000 UTC m=+2051.561480362" Sep 29 10:22:06 crc kubenswrapper[4891]: I0929 10:22:06.185591 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Sep 29 10:22:06 crc kubenswrapper[4891]: I0929 10:22:06.186188 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:22:36 crc kubenswrapper[4891]: I0929 10:22:36.185896 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:22:36 crc kubenswrapper[4891]: I0929 10:22:36.186848 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:22:48 crc kubenswrapper[4891]: I0929 10:22:48.684833 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wpzhq"] Sep 29 10:22:48 crc kubenswrapper[4891]: I0929 10:22:48.687656 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpzhq" Sep 29 10:22:48 crc kubenswrapper[4891]: I0929 10:22:48.705041 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpzhq"] Sep 29 10:22:48 crc kubenswrapper[4891]: I0929 10:22:48.745844 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9cd8fe3-e442-4d93-ad5f-1516016a67e6-utilities\") pod \"redhat-marketplace-wpzhq\" (UID: \"f9cd8fe3-e442-4d93-ad5f-1516016a67e6\") " pod="openshift-marketplace/redhat-marketplace-wpzhq" Sep 29 10:22:48 crc kubenswrapper[4891]: I0929 10:22:48.745890 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lszxt\" (UniqueName: \"kubernetes.io/projected/f9cd8fe3-e442-4d93-ad5f-1516016a67e6-kube-api-access-lszxt\") pod \"redhat-marketplace-wpzhq\" (UID: \"f9cd8fe3-e442-4d93-ad5f-1516016a67e6\") " pod="openshift-marketplace/redhat-marketplace-wpzhq" Sep 29 10:22:48 crc kubenswrapper[4891]: I0929 10:22:48.745936 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9cd8fe3-e442-4d93-ad5f-1516016a67e6-catalog-content\") pod \"redhat-marketplace-wpzhq\" (UID: \"f9cd8fe3-e442-4d93-ad5f-1516016a67e6\") " pod="openshift-marketplace/redhat-marketplace-wpzhq" Sep 29 10:22:48 crc kubenswrapper[4891]: I0929 10:22:48.847613 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9cd8fe3-e442-4d93-ad5f-1516016a67e6-catalog-content\") pod \"redhat-marketplace-wpzhq\" (UID: \"f9cd8fe3-e442-4d93-ad5f-1516016a67e6\") " pod="openshift-marketplace/redhat-marketplace-wpzhq" Sep 29 10:22:48 crc kubenswrapper[4891]: I0929 10:22:48.847807 4891 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9cd8fe3-e442-4d93-ad5f-1516016a67e6-utilities\") pod \"redhat-marketplace-wpzhq\" (UID: \"f9cd8fe3-e442-4d93-ad5f-1516016a67e6\") " pod="openshift-marketplace/redhat-marketplace-wpzhq" Sep 29 10:22:48 crc kubenswrapper[4891]: I0929 10:22:48.847837 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lszxt\" (UniqueName: \"kubernetes.io/projected/f9cd8fe3-e442-4d93-ad5f-1516016a67e6-kube-api-access-lszxt\") pod \"redhat-marketplace-wpzhq\" (UID: \"f9cd8fe3-e442-4d93-ad5f-1516016a67e6\") " pod="openshift-marketplace/redhat-marketplace-wpzhq" Sep 29 10:22:48 crc kubenswrapper[4891]: I0929 10:22:48.848206 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9cd8fe3-e442-4d93-ad5f-1516016a67e6-catalog-content\") pod \"redhat-marketplace-wpzhq\" (UID: \"f9cd8fe3-e442-4d93-ad5f-1516016a67e6\") " pod="openshift-marketplace/redhat-marketplace-wpzhq" Sep 29 10:22:48 crc kubenswrapper[4891]: I0929 10:22:48.848276 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9cd8fe3-e442-4d93-ad5f-1516016a67e6-utilities\") pod \"redhat-marketplace-wpzhq\" (UID: \"f9cd8fe3-e442-4d93-ad5f-1516016a67e6\") " pod="openshift-marketplace/redhat-marketplace-wpzhq" Sep 29 10:22:48 crc kubenswrapper[4891]: I0929 10:22:48.865647 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lszxt\" (UniqueName: \"kubernetes.io/projected/f9cd8fe3-e442-4d93-ad5f-1516016a67e6-kube-api-access-lszxt\") pod \"redhat-marketplace-wpzhq\" (UID: \"f9cd8fe3-e442-4d93-ad5f-1516016a67e6\") " pod="openshift-marketplace/redhat-marketplace-wpzhq" Sep 29 10:22:49 crc kubenswrapper[4891]: I0929 10:22:49.015887 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpzhq" Sep 29 10:22:49 crc kubenswrapper[4891]: I0929 10:22:49.532918 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpzhq"] Sep 29 10:22:49 crc kubenswrapper[4891]: I0929 10:22:49.811113 4891 generic.go:334] "Generic (PLEG): container finished" podID="f9cd8fe3-e442-4d93-ad5f-1516016a67e6" containerID="94710ea3078c2046cc9f5041a4a039605754c4f8188efaf922bfc4c632b510a9" exitCode=0 Sep 29 10:22:49 crc kubenswrapper[4891]: I0929 10:22:49.811248 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpzhq" event={"ID":"f9cd8fe3-e442-4d93-ad5f-1516016a67e6","Type":"ContainerDied","Data":"94710ea3078c2046cc9f5041a4a039605754c4f8188efaf922bfc4c632b510a9"} Sep 29 10:22:49 crc kubenswrapper[4891]: I0929 10:22:49.811492 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpzhq" event={"ID":"f9cd8fe3-e442-4d93-ad5f-1516016a67e6","Type":"ContainerStarted","Data":"c5c4b160eb77578814b1a25c195e7afc28aee545a4d1a2ddb7e70ec34be00746"} Sep 29 10:22:50 crc kubenswrapper[4891]: I0929 10:22:50.826519 4891 generic.go:334] "Generic (PLEG): container finished" podID="f9cd8fe3-e442-4d93-ad5f-1516016a67e6" containerID="7b531780493922d1d4918b39771f080d56a77f1c3bf404d0ad444919794e088d" exitCode=0 Sep 29 10:22:50 crc kubenswrapper[4891]: I0929 10:22:50.826624 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpzhq" event={"ID":"f9cd8fe3-e442-4d93-ad5f-1516016a67e6","Type":"ContainerDied","Data":"7b531780493922d1d4918b39771f080d56a77f1c3bf404d0ad444919794e088d"} Sep 29 10:22:51 crc kubenswrapper[4891]: I0929 10:22:51.838487 4891 generic.go:334] "Generic (PLEG): container finished" podID="cc623a81-2fe0-42a2-8f61-cb9ab6909984" containerID="157d8478502a4f611feabe4e4f2c653303d394354d48cd03506d767dc7f8e848" exitCode=0 Sep 29 10:22:51 
crc kubenswrapper[4891]: I0929 10:22:51.838564 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" event={"ID":"cc623a81-2fe0-42a2-8f61-cb9ab6909984","Type":"ContainerDied","Data":"157d8478502a4f611feabe4e4f2c653303d394354d48cd03506d767dc7f8e848"} Sep 29 10:22:51 crc kubenswrapper[4891]: I0929 10:22:51.842373 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpzhq" event={"ID":"f9cd8fe3-e442-4d93-ad5f-1516016a67e6","Type":"ContainerStarted","Data":"aca12b9f173a45690eb2bbe5baa3379d98c886b3e464448c8e3393c52b992b6d"} Sep 29 10:22:51 crc kubenswrapper[4891]: I0929 10:22:51.895475 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wpzhq" podStartSLOduration=2.48545428 podStartE2EDuration="3.895456176s" podCreationTimestamp="2025-09-29 10:22:48 +0000 UTC" firstStartedPulling="2025-09-29 10:22:49.812525641 +0000 UTC m=+2100.017693962" lastFinishedPulling="2025-09-29 10:22:51.222527537 +0000 UTC m=+2101.427695858" observedRunningTime="2025-09-29 10:22:51.888985047 +0000 UTC m=+2102.094153378" watchObservedRunningTime="2025-09-29 10:22:51.895456176 +0000 UTC m=+2102.100624517" Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.232429 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.356444 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-neutron-ovn-metadata-agent-neutron-config-0\") pod \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\" (UID: \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\") " Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.356566 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-neutron-metadata-combined-ca-bundle\") pod \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\" (UID: \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\") " Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.356652 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtqc8\" (UniqueName: \"kubernetes.io/projected/cc623a81-2fe0-42a2-8f61-cb9ab6909984-kube-api-access-mtqc8\") pod \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\" (UID: \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\") " Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.356718 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-inventory\") pod \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\" (UID: \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\") " Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.356758 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-nova-metadata-neutron-config-0\") pod \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\" (UID: \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\") " Sep 
29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.356833 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-ssh-key\") pod \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\" (UID: \"cc623a81-2fe0-42a2-8f61-cb9ab6909984\") " Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.362864 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc623a81-2fe0-42a2-8f61-cb9ab6909984-kube-api-access-mtqc8" (OuterVolumeSpecName: "kube-api-access-mtqc8") pod "cc623a81-2fe0-42a2-8f61-cb9ab6909984" (UID: "cc623a81-2fe0-42a2-8f61-cb9ab6909984"). InnerVolumeSpecName "kube-api-access-mtqc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.364713 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "cc623a81-2fe0-42a2-8f61-cb9ab6909984" (UID: "cc623a81-2fe0-42a2-8f61-cb9ab6909984"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.387227 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "cc623a81-2fe0-42a2-8f61-cb9ab6909984" (UID: "cc623a81-2fe0-42a2-8f61-cb9ab6909984"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.390165 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-inventory" (OuterVolumeSpecName: "inventory") pod "cc623a81-2fe0-42a2-8f61-cb9ab6909984" (UID: "cc623a81-2fe0-42a2-8f61-cb9ab6909984"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.394956 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cc623a81-2fe0-42a2-8f61-cb9ab6909984" (UID: "cc623a81-2fe0-42a2-8f61-cb9ab6909984"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.396256 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "cc623a81-2fe0-42a2-8f61-cb9ab6909984" (UID: "cc623a81-2fe0-42a2-8f61-cb9ab6909984"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.458950 4891 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.459482 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtqc8\" (UniqueName: \"kubernetes.io/projected/cc623a81-2fe0-42a2-8f61-cb9ab6909984-kube-api-access-mtqc8\") on node \"crc\" DevicePath \"\"" Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.459511 4891 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.459527 4891 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.459540 4891 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.459552 4891 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cc623a81-2fe0-42a2-8f61-cb9ab6909984-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.860971 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7" 
event={"ID":"cc623a81-2fe0-42a2-8f61-cb9ab6909984","Type":"ContainerDied","Data":"0f4ed1ea3b389b075200e68c9d4c2a5de9d7d6492836b560089aab2c129edf5e"}
Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.861016 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f4ed1ea3b389b075200e68c9d4c2a5de9d7d6492836b560089aab2c129edf5e"
Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.861060 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7"
Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.963680 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5"]
Sep 29 10:22:53 crc kubenswrapper[4891]: E0929 10:22:53.964116 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc623a81-2fe0-42a2-8f61-cb9ab6909984" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.964134 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc623a81-2fe0-42a2-8f61-cb9ab6909984" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.964310 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc623a81-2fe0-42a2-8f61-cb9ab6909984" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.964980 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5"
Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.969590 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b9rgd"
Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.970102 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.973924 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.974201 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.976759 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 29 10:22:53 crc kubenswrapper[4891]: I0929 10:22:53.977076 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5"]
Sep 29 10:22:54 crc kubenswrapper[4891]: I0929 10:22:54.069585 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ed5239b3-e586-4a56-89ce-74977f3509db-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-l85z5\" (UID: \"ed5239b3-e586-4a56-89ce-74977f3509db\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5"
Sep 29 10:22:54 crc kubenswrapper[4891]: I0929 10:22:54.070016 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed5239b3-e586-4a56-89ce-74977f3509db-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-l85z5\" (UID: \"ed5239b3-e586-4a56-89ce-74977f3509db\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5"
Sep 29 10:22:54 crc kubenswrapper[4891]: I0929 10:22:54.070070 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed5239b3-e586-4a56-89ce-74977f3509db-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-l85z5\" (UID: \"ed5239b3-e586-4a56-89ce-74977f3509db\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5"
Sep 29 10:22:54 crc kubenswrapper[4891]: I0929 10:22:54.070092 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wvvn\" (UniqueName: \"kubernetes.io/projected/ed5239b3-e586-4a56-89ce-74977f3509db-kube-api-access-6wvvn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-l85z5\" (UID: \"ed5239b3-e586-4a56-89ce-74977f3509db\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5"
Sep 29 10:22:54 crc kubenswrapper[4891]: I0929 10:22:54.070309 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ed5239b3-e586-4a56-89ce-74977f3509db-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-l85z5\" (UID: \"ed5239b3-e586-4a56-89ce-74977f3509db\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5"
Sep 29 10:22:54 crc kubenswrapper[4891]: I0929 10:22:54.172049 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ed5239b3-e586-4a56-89ce-74977f3509db-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-l85z5\" (UID: \"ed5239b3-e586-4a56-89ce-74977f3509db\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5"
Sep 29 10:22:54 crc kubenswrapper[4891]: I0929 10:22:54.172131 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed5239b3-e586-4a56-89ce-74977f3509db-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-l85z5\" (UID: \"ed5239b3-e586-4a56-89ce-74977f3509db\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5"
Sep 29 10:22:54 crc kubenswrapper[4891]: I0929 10:22:54.172183 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed5239b3-e586-4a56-89ce-74977f3509db-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-l85z5\" (UID: \"ed5239b3-e586-4a56-89ce-74977f3509db\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5"
Sep 29 10:22:54 crc kubenswrapper[4891]: I0929 10:22:54.172201 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wvvn\" (UniqueName: \"kubernetes.io/projected/ed5239b3-e586-4a56-89ce-74977f3509db-kube-api-access-6wvvn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-l85z5\" (UID: \"ed5239b3-e586-4a56-89ce-74977f3509db\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5"
Sep 29 10:22:54 crc kubenswrapper[4891]: I0929 10:22:54.172239 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ed5239b3-e586-4a56-89ce-74977f3509db-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-l85z5\" (UID: \"ed5239b3-e586-4a56-89ce-74977f3509db\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5"
Sep 29 10:22:54 crc kubenswrapper[4891]: I0929 10:22:54.177457 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ed5239b3-e586-4a56-89ce-74977f3509db-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-l85z5\" (UID: \"ed5239b3-e586-4a56-89ce-74977f3509db\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5"
Sep 29 10:22:54 crc kubenswrapper[4891]: I0929 10:22:54.177508 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ed5239b3-e586-4a56-89ce-74977f3509db-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-l85z5\" (UID: \"ed5239b3-e586-4a56-89ce-74977f3509db\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5"
Sep 29 10:22:54 crc kubenswrapper[4891]: I0929 10:22:54.177596 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed5239b3-e586-4a56-89ce-74977f3509db-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-l85z5\" (UID: \"ed5239b3-e586-4a56-89ce-74977f3509db\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5"
Sep 29 10:22:54 crc kubenswrapper[4891]: I0929 10:22:54.178664 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed5239b3-e586-4a56-89ce-74977f3509db-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-l85z5\" (UID: \"ed5239b3-e586-4a56-89ce-74977f3509db\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5"
Sep 29 10:22:54 crc kubenswrapper[4891]: I0929 10:22:54.188677 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wvvn\" (UniqueName: \"kubernetes.io/projected/ed5239b3-e586-4a56-89ce-74977f3509db-kube-api-access-6wvvn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-l85z5\" (UID: \"ed5239b3-e586-4a56-89ce-74977f3509db\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5"
Sep 29 10:22:54 crc kubenswrapper[4891]: I0929 10:22:54.299806 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5"
Sep 29 10:22:54 crc kubenswrapper[4891]: I0929 10:22:54.868503 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5"]
Sep 29 10:22:54 crc kubenswrapper[4891]: W0929 10:22:54.877972 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded5239b3_e586_4a56_89ce_74977f3509db.slice/crio-28119cf2140977338535f41387def677e1934bfa0c794393b2b9c63c733e2c7e WatchSource:0}: Error finding container 28119cf2140977338535f41387def677e1934bfa0c794393b2b9c63c733e2c7e: Status 404 returned error can't find the container with id 28119cf2140977338535f41387def677e1934bfa0c794393b2b9c63c733e2c7e
Sep 29 10:22:55 crc kubenswrapper[4891]: I0929 10:22:55.888455 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5" event={"ID":"ed5239b3-e586-4a56-89ce-74977f3509db","Type":"ContainerStarted","Data":"7a6fd8ce37c0b51978f17eaedb6d1b89a7c8fb02a4666d34dd14d4748fe12360"}
Sep 29 10:22:55 crc kubenswrapper[4891]: I0929 10:22:55.888892 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5" event={"ID":"ed5239b3-e586-4a56-89ce-74977f3509db","Type":"ContainerStarted","Data":"28119cf2140977338535f41387def677e1934bfa0c794393b2b9c63c733e2c7e"}
Sep 29 10:22:55 crc kubenswrapper[4891]: I0929 10:22:55.924245 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5" podStartSLOduration=2.465635089 podStartE2EDuration="2.924213798s" podCreationTimestamp="2025-09-29 10:22:53 +0000 UTC" firstStartedPulling="2025-09-29 10:22:54.881101478 +0000 UTC m=+2105.086269809" lastFinishedPulling="2025-09-29 10:22:55.339680187 +0000 UTC m=+2105.544848518" observedRunningTime="2025-09-29 10:22:55.907736908 +0000 UTC m=+2106.112905309" watchObservedRunningTime="2025-09-29 10:22:55.924213798 +0000 UTC m=+2106.129382159"
Sep 29 10:22:59 crc kubenswrapper[4891]: I0929 10:22:59.016066 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wpzhq"
Sep 29 10:22:59 crc kubenswrapper[4891]: I0929 10:22:59.017013 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wpzhq"
Sep 29 10:22:59 crc kubenswrapper[4891]: I0929 10:22:59.072461 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wpzhq"
Sep 29 10:22:59 crc kubenswrapper[4891]: I0929 10:22:59.991687 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wpzhq"
Sep 29 10:23:00 crc kubenswrapper[4891]: I0929 10:23:00.048437 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpzhq"]
Sep 29 10:23:01 crc kubenswrapper[4891]: I0929 10:23:01.949691 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wpzhq" podUID="f9cd8fe3-e442-4d93-ad5f-1516016a67e6" containerName="registry-server" containerID="cri-o://aca12b9f173a45690eb2bbe5baa3379d98c886b3e464448c8e3393c52b992b6d" gracePeriod=2
Sep 29 10:23:02 crc kubenswrapper[4891]: I0929 10:23:02.428957 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpzhq"
Sep 29 10:23:02 crc kubenswrapper[4891]: I0929 10:23:02.546455 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9cd8fe3-e442-4d93-ad5f-1516016a67e6-catalog-content\") pod \"f9cd8fe3-e442-4d93-ad5f-1516016a67e6\" (UID: \"f9cd8fe3-e442-4d93-ad5f-1516016a67e6\") "
Sep 29 10:23:02 crc kubenswrapper[4891]: I0929 10:23:02.546837 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9cd8fe3-e442-4d93-ad5f-1516016a67e6-utilities\") pod \"f9cd8fe3-e442-4d93-ad5f-1516016a67e6\" (UID: \"f9cd8fe3-e442-4d93-ad5f-1516016a67e6\") "
Sep 29 10:23:02 crc kubenswrapper[4891]: I0929 10:23:02.546890 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lszxt\" (UniqueName: \"kubernetes.io/projected/f9cd8fe3-e442-4d93-ad5f-1516016a67e6-kube-api-access-lszxt\") pod \"f9cd8fe3-e442-4d93-ad5f-1516016a67e6\" (UID: \"f9cd8fe3-e442-4d93-ad5f-1516016a67e6\") "
Sep 29 10:23:02 crc kubenswrapper[4891]: I0929 10:23:02.547736 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9cd8fe3-e442-4d93-ad5f-1516016a67e6-utilities" (OuterVolumeSpecName: "utilities") pod "f9cd8fe3-e442-4d93-ad5f-1516016a67e6" (UID: "f9cd8fe3-e442-4d93-ad5f-1516016a67e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 10:23:02 crc kubenswrapper[4891]: I0929 10:23:02.553146 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9cd8fe3-e442-4d93-ad5f-1516016a67e6-kube-api-access-lszxt" (OuterVolumeSpecName: "kube-api-access-lszxt") pod "f9cd8fe3-e442-4d93-ad5f-1516016a67e6" (UID: "f9cd8fe3-e442-4d93-ad5f-1516016a67e6"). InnerVolumeSpecName "kube-api-access-lszxt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:23:02 crc kubenswrapper[4891]: I0929 10:23:02.560846 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9cd8fe3-e442-4d93-ad5f-1516016a67e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9cd8fe3-e442-4d93-ad5f-1516016a67e6" (UID: "f9cd8fe3-e442-4d93-ad5f-1516016a67e6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 10:23:02 crc kubenswrapper[4891]: I0929 10:23:02.648997 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9cd8fe3-e442-4d93-ad5f-1516016a67e6-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 29 10:23:02 crc kubenswrapper[4891]: I0929 10:23:02.649026 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9cd8fe3-e442-4d93-ad5f-1516016a67e6-utilities\") on node \"crc\" DevicePath \"\""
Sep 29 10:23:02 crc kubenswrapper[4891]: I0929 10:23:02.649036 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lszxt\" (UniqueName: \"kubernetes.io/projected/f9cd8fe3-e442-4d93-ad5f-1516016a67e6-kube-api-access-lszxt\") on node \"crc\" DevicePath \"\""
Sep 29 10:23:02 crc kubenswrapper[4891]: I0929 10:23:02.961801 4891 generic.go:334] "Generic (PLEG): container finished" podID="f9cd8fe3-e442-4d93-ad5f-1516016a67e6" containerID="aca12b9f173a45690eb2bbe5baa3379d98c886b3e464448c8e3393c52b992b6d" exitCode=0
Sep 29 10:23:02 crc kubenswrapper[4891]: I0929 10:23:02.961854 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpzhq" event={"ID":"f9cd8fe3-e442-4d93-ad5f-1516016a67e6","Type":"ContainerDied","Data":"aca12b9f173a45690eb2bbe5baa3379d98c886b3e464448c8e3393c52b992b6d"}
Sep 29 10:23:02 crc kubenswrapper[4891]: I0929 10:23:02.961896 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpzhq" event={"ID":"f9cd8fe3-e442-4d93-ad5f-1516016a67e6","Type":"ContainerDied","Data":"c5c4b160eb77578814b1a25c195e7afc28aee545a4d1a2ddb7e70ec34be00746"}
Sep 29 10:23:02 crc kubenswrapper[4891]: I0929 10:23:02.961916 4891 scope.go:117] "RemoveContainer" containerID="aca12b9f173a45690eb2bbe5baa3379d98c886b3e464448c8e3393c52b992b6d"
Sep 29 10:23:02 crc kubenswrapper[4891]: I0929 10:23:02.961968 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpzhq"
Sep 29 10:23:03 crc kubenswrapper[4891]: I0929 10:23:03.005185 4891 scope.go:117] "RemoveContainer" containerID="7b531780493922d1d4918b39771f080d56a77f1c3bf404d0ad444919794e088d"
Sep 29 10:23:03 crc kubenswrapper[4891]: I0929 10:23:03.025225 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpzhq"]
Sep 29 10:23:03 crc kubenswrapper[4891]: I0929 10:23:03.041101 4891 scope.go:117] "RemoveContainer" containerID="94710ea3078c2046cc9f5041a4a039605754c4f8188efaf922bfc4c632b510a9"
Sep 29 10:23:03 crc kubenswrapper[4891]: I0929 10:23:03.044859 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpzhq"]
Sep 29 10:23:03 crc kubenswrapper[4891]: I0929 10:23:03.081123 4891 scope.go:117] "RemoveContainer" containerID="aca12b9f173a45690eb2bbe5baa3379d98c886b3e464448c8e3393c52b992b6d"
Sep 29 10:23:03 crc kubenswrapper[4891]: E0929 10:23:03.081845 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aca12b9f173a45690eb2bbe5baa3379d98c886b3e464448c8e3393c52b992b6d\": container with ID starting with aca12b9f173a45690eb2bbe5baa3379d98c886b3e464448c8e3393c52b992b6d not found: ID does not exist" containerID="aca12b9f173a45690eb2bbe5baa3379d98c886b3e464448c8e3393c52b992b6d"
Sep 29 10:23:03 crc kubenswrapper[4891]: I0929 10:23:03.081902 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aca12b9f173a45690eb2bbe5baa3379d98c886b3e464448c8e3393c52b992b6d"} err="failed to get container status \"aca12b9f173a45690eb2bbe5baa3379d98c886b3e464448c8e3393c52b992b6d\": rpc error: code = NotFound desc = could not find container \"aca12b9f173a45690eb2bbe5baa3379d98c886b3e464448c8e3393c52b992b6d\": container with ID starting with aca12b9f173a45690eb2bbe5baa3379d98c886b3e464448c8e3393c52b992b6d not found: ID does not exist"
Sep 29 10:23:03 crc kubenswrapper[4891]: I0929 10:23:03.081936 4891 scope.go:117] "RemoveContainer" containerID="7b531780493922d1d4918b39771f080d56a77f1c3bf404d0ad444919794e088d"
Sep 29 10:23:03 crc kubenswrapper[4891]: E0929 10:23:03.082468 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b531780493922d1d4918b39771f080d56a77f1c3bf404d0ad444919794e088d\": container with ID starting with 7b531780493922d1d4918b39771f080d56a77f1c3bf404d0ad444919794e088d not found: ID does not exist" containerID="7b531780493922d1d4918b39771f080d56a77f1c3bf404d0ad444919794e088d"
Sep 29 10:23:03 crc kubenswrapper[4891]: I0929 10:23:03.082501 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b531780493922d1d4918b39771f080d56a77f1c3bf404d0ad444919794e088d"} err="failed to get container status \"7b531780493922d1d4918b39771f080d56a77f1c3bf404d0ad444919794e088d\": rpc error: code = NotFound desc = could not find container \"7b531780493922d1d4918b39771f080d56a77f1c3bf404d0ad444919794e088d\": container with ID starting with 7b531780493922d1d4918b39771f080d56a77f1c3bf404d0ad444919794e088d not found: ID does not exist"
Sep 29 10:23:03 crc kubenswrapper[4891]: I0929 10:23:03.082523 4891 scope.go:117] "RemoveContainer" containerID="94710ea3078c2046cc9f5041a4a039605754c4f8188efaf922bfc4c632b510a9"
Sep 29 10:23:03 crc kubenswrapper[4891]: E0929 10:23:03.082835 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94710ea3078c2046cc9f5041a4a039605754c4f8188efaf922bfc4c632b510a9\": container with ID starting with 94710ea3078c2046cc9f5041a4a039605754c4f8188efaf922bfc4c632b510a9 not found: ID does not exist" containerID="94710ea3078c2046cc9f5041a4a039605754c4f8188efaf922bfc4c632b510a9"
Sep 29 10:23:03 crc kubenswrapper[4891]: I0929 10:23:03.082864 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94710ea3078c2046cc9f5041a4a039605754c4f8188efaf922bfc4c632b510a9"} err="failed to get container status \"94710ea3078c2046cc9f5041a4a039605754c4f8188efaf922bfc4c632b510a9\": rpc error: code = NotFound desc = could not find container \"94710ea3078c2046cc9f5041a4a039605754c4f8188efaf922bfc4c632b510a9\": container with ID starting with 94710ea3078c2046cc9f5041a4a039605754c4f8188efaf922bfc4c632b510a9 not found: ID does not exist"
Sep 29 10:23:04 crc kubenswrapper[4891]: I0929 10:23:04.416661 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9cd8fe3-e442-4d93-ad5f-1516016a67e6" path="/var/lib/kubelet/pods/f9cd8fe3-e442-4d93-ad5f-1516016a67e6/volumes"
Sep 29 10:23:06 crc kubenswrapper[4891]: I0929 10:23:06.186195 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 29 10:23:06 crc kubenswrapper[4891]: I0929 10:23:06.186273 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 10:23:06 crc kubenswrapper[4891]: I0929 10:23:06.186325 4891 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp"
Sep 29 10:23:06 crc kubenswrapper[4891]: I0929 10:23:06.187280 4891 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bff6fe3da678e36dd64ab83405868d66f2473cdc2a11bd70b1d1ce788f8ab4cf"} pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 29 10:23:06 crc kubenswrapper[4891]: I0929 10:23:06.187349 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" containerID="cri-o://bff6fe3da678e36dd64ab83405868d66f2473cdc2a11bd70b1d1ce788f8ab4cf" gracePeriod=600
Sep 29 10:23:07 crc kubenswrapper[4891]: I0929 10:23:07.009349 4891 generic.go:334] "Generic (PLEG): container finished" podID="582de198-5a15-4c4c-aaea-881c638a42ac" containerID="bff6fe3da678e36dd64ab83405868d66f2473cdc2a11bd70b1d1ce788f8ab4cf" exitCode=0
Sep 29 10:23:07 crc kubenswrapper[4891]: I0929 10:23:07.009896 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerDied","Data":"bff6fe3da678e36dd64ab83405868d66f2473cdc2a11bd70b1d1ce788f8ab4cf"}
Sep 29 10:23:07 crc kubenswrapper[4891]: I0929 10:23:07.009938 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerStarted","Data":"8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece"}
Sep 29 10:23:07 crc kubenswrapper[4891]: I0929 10:23:07.009966 4891 scope.go:117] "RemoveContainer" containerID="4313125f7b96347732fb8b862a6873ee3416bf3f15a7ccec31c8a868b4241ad5"
Sep 29 10:23:29 crc kubenswrapper[4891]: I0929 10:23:29.393146 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4cmsh"]
Sep 29 10:23:29 crc kubenswrapper[4891]: E0929 10:23:29.394115 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9cd8fe3-e442-4d93-ad5f-1516016a67e6" containerName="extract-content"
Sep 29 10:23:29 crc kubenswrapper[4891]: I0929 10:23:29.394130 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9cd8fe3-e442-4d93-ad5f-1516016a67e6" containerName="extract-content"
Sep 29 10:23:29 crc kubenswrapper[4891]: E0929 10:23:29.394186 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9cd8fe3-e442-4d93-ad5f-1516016a67e6" containerName="registry-server"
Sep 29 10:23:29 crc kubenswrapper[4891]: I0929 10:23:29.394194 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9cd8fe3-e442-4d93-ad5f-1516016a67e6" containerName="registry-server"
Sep 29 10:23:29 crc kubenswrapper[4891]: E0929 10:23:29.394210 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9cd8fe3-e442-4d93-ad5f-1516016a67e6" containerName="extract-utilities"
Sep 29 10:23:29 crc kubenswrapper[4891]: I0929 10:23:29.394219 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9cd8fe3-e442-4d93-ad5f-1516016a67e6" containerName="extract-utilities"
Sep 29 10:23:29 crc kubenswrapper[4891]: I0929 10:23:29.394476 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9cd8fe3-e442-4d93-ad5f-1516016a67e6" containerName="registry-server"
Sep 29 10:23:29 crc kubenswrapper[4891]: I0929 10:23:29.396470 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4cmsh"
Sep 29 10:23:29 crc kubenswrapper[4891]: I0929 10:23:29.409477 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4cmsh"]
Sep 29 10:23:29 crc kubenswrapper[4891]: I0929 10:23:29.477970 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qfpb\" (UniqueName: \"kubernetes.io/projected/5c57787a-9e94-42b9-a082-3b17f4358d96-kube-api-access-5qfpb\") pod \"community-operators-4cmsh\" (UID: \"5c57787a-9e94-42b9-a082-3b17f4358d96\") " pod="openshift-marketplace/community-operators-4cmsh"
Sep 29 10:23:29 crc kubenswrapper[4891]: I0929 10:23:29.478050 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c57787a-9e94-42b9-a082-3b17f4358d96-utilities\") pod \"community-operators-4cmsh\" (UID: \"5c57787a-9e94-42b9-a082-3b17f4358d96\") " pod="openshift-marketplace/community-operators-4cmsh"
Sep 29 10:23:29 crc kubenswrapper[4891]: I0929 10:23:29.478097 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c57787a-9e94-42b9-a082-3b17f4358d96-catalog-content\") pod \"community-operators-4cmsh\" (UID: \"5c57787a-9e94-42b9-a082-3b17f4358d96\") " pod="openshift-marketplace/community-operators-4cmsh"
Sep 29 10:23:29 crc kubenswrapper[4891]: I0929 10:23:29.581373 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qfpb\" (UniqueName: \"kubernetes.io/projected/5c57787a-9e94-42b9-a082-3b17f4358d96-kube-api-access-5qfpb\") pod \"community-operators-4cmsh\" (UID: \"5c57787a-9e94-42b9-a082-3b17f4358d96\") " pod="openshift-marketplace/community-operators-4cmsh"
Sep 29 10:23:29 crc kubenswrapper[4891]: I0929 10:23:29.581476 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c57787a-9e94-42b9-a082-3b17f4358d96-utilities\") pod \"community-operators-4cmsh\" (UID: \"5c57787a-9e94-42b9-a082-3b17f4358d96\") " pod="openshift-marketplace/community-operators-4cmsh"
Sep 29 10:23:29 crc kubenswrapper[4891]: I0929 10:23:29.581528 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c57787a-9e94-42b9-a082-3b17f4358d96-catalog-content\") pod \"community-operators-4cmsh\" (UID: \"5c57787a-9e94-42b9-a082-3b17f4358d96\") " pod="openshift-marketplace/community-operators-4cmsh"
Sep 29 10:23:29 crc kubenswrapper[4891]: I0929 10:23:29.582302 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c57787a-9e94-42b9-a082-3b17f4358d96-catalog-content\") pod \"community-operators-4cmsh\" (UID: \"5c57787a-9e94-42b9-a082-3b17f4358d96\") " pod="openshift-marketplace/community-operators-4cmsh"
Sep 29 10:23:29 crc kubenswrapper[4891]: I0929 10:23:29.582419 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c57787a-9e94-42b9-a082-3b17f4358d96-utilities\") pod \"community-operators-4cmsh\" (UID: \"5c57787a-9e94-42b9-a082-3b17f4358d96\") " pod="openshift-marketplace/community-operators-4cmsh"
Sep 29 10:23:29 crc kubenswrapper[4891]: I0929 10:23:29.600238 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qfpb\" (UniqueName: \"kubernetes.io/projected/5c57787a-9e94-42b9-a082-3b17f4358d96-kube-api-access-5qfpb\") pod \"community-operators-4cmsh\" (UID: \"5c57787a-9e94-42b9-a082-3b17f4358d96\") " pod="openshift-marketplace/community-operators-4cmsh"
Sep 29 10:23:29 crc kubenswrapper[4891]: I0929 10:23:29.716568 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4cmsh"
Sep 29 10:23:30 crc kubenswrapper[4891]: I0929 10:23:30.239364 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4cmsh"]
Sep 29 10:23:30 crc kubenswrapper[4891]: W0929 10:23:30.245647 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c57787a_9e94_42b9_a082_3b17f4358d96.slice/crio-875b8ff900ed8a822a24376d45bdda8e0e76d5479e1d0eaf76b40481fac19e64 WatchSource:0}: Error finding container 875b8ff900ed8a822a24376d45bdda8e0e76d5479e1d0eaf76b40481fac19e64: Status 404 returned error can't find the container with id 875b8ff900ed8a822a24376d45bdda8e0e76d5479e1d0eaf76b40481fac19e64
Sep 29 10:23:31 crc kubenswrapper[4891]: I0929 10:23:31.236815 4891 generic.go:334] "Generic (PLEG): container finished" podID="5c57787a-9e94-42b9-a082-3b17f4358d96" containerID="29d64ccb4afc4de288575fd7f138071e6d967cedfa637812a18ac2ab7d2ea86e" exitCode=0
Sep 29 10:23:31 crc kubenswrapper[4891]: I0929 10:23:31.236906 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4cmsh" event={"ID":"5c57787a-9e94-42b9-a082-3b17f4358d96","Type":"ContainerDied","Data":"29d64ccb4afc4de288575fd7f138071e6d967cedfa637812a18ac2ab7d2ea86e"}
Sep 29 10:23:31 crc kubenswrapper[4891]: I0929 10:23:31.238306 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4cmsh" event={"ID":"5c57787a-9e94-42b9-a082-3b17f4358d96","Type":"ContainerStarted","Data":"875b8ff900ed8a822a24376d45bdda8e0e76d5479e1d0eaf76b40481fac19e64"}
Sep 29 10:23:32 crc kubenswrapper[4891]: I0929 10:23:32.252348 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4cmsh" event={"ID":"5c57787a-9e94-42b9-a082-3b17f4358d96","Type":"ContainerStarted","Data":"a1e10971d8c39ec3da630d352d93eba247f2807c4ca94a00fbe2cd2f6be6201d"}
Sep 29 10:23:33 crc kubenswrapper[4891]: I0929 10:23:33.263576 4891 generic.go:334] "Generic (PLEG): container finished" podID="5c57787a-9e94-42b9-a082-3b17f4358d96" containerID="a1e10971d8c39ec3da630d352d93eba247f2807c4ca94a00fbe2cd2f6be6201d" exitCode=0
Sep 29 10:23:33 crc kubenswrapper[4891]: I0929 10:23:33.263643 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4cmsh" event={"ID":"5c57787a-9e94-42b9-a082-3b17f4358d96","Type":"ContainerDied","Data":"a1e10971d8c39ec3da630d352d93eba247f2807c4ca94a00fbe2cd2f6be6201d"}
Sep 29 10:23:34 crc kubenswrapper[4891]: I0929 10:23:34.276109 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4cmsh" event={"ID":"5c57787a-9e94-42b9-a082-3b17f4358d96","Type":"ContainerStarted","Data":"f4d923cb0388e7b8ee2c0b13bd4130bd69378361653873536cb4f8228f0a9453"}
Sep 29 10:23:34 crc kubenswrapper[4891]: I0929 10:23:34.298361 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4cmsh" podStartSLOduration=2.773589158 podStartE2EDuration="5.298338893s" podCreationTimestamp="2025-09-29 10:23:29 +0000 UTC" firstStartedPulling="2025-09-29 10:23:31.240920959 +0000 UTC m=+2141.446089280" lastFinishedPulling="2025-09-29 10:23:33.765670704 +0000 UTC m=+2143.970839015" observedRunningTime="2025-09-29 10:23:34.294484801 +0000 UTC m=+2144.499653152" watchObservedRunningTime="2025-09-29 10:23:34.298338893 +0000 UTC m=+2144.503507234"
Sep 29 10:23:39 crc kubenswrapper[4891]: I0929 10:23:39.716911 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4cmsh"
Sep 29 10:23:39 crc kubenswrapper[4891]: I0929 10:23:39.717103 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4cmsh"
Sep 29 10:23:39 crc kubenswrapper[4891]: I0929 10:23:39.762716 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4cmsh"
Sep 29 10:23:40 crc kubenswrapper[4891]: I0929 10:23:40.404924 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4cmsh"
Sep 29 10:23:40 crc kubenswrapper[4891]: I0929 10:23:40.464865 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4cmsh"]
Sep 29 10:23:42 crc kubenswrapper[4891]: I0929 10:23:42.358708 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4cmsh" podUID="5c57787a-9e94-42b9-a082-3b17f4358d96" containerName="registry-server" containerID="cri-o://f4d923cb0388e7b8ee2c0b13bd4130bd69378361653873536cb4f8228f0a9453" gracePeriod=2
Sep 29 10:23:42 crc kubenswrapper[4891]: I0929 10:23:42.798867 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4cmsh"
Sep 29 10:23:42 crc kubenswrapper[4891]: I0929 10:23:42.930912 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c57787a-9e94-42b9-a082-3b17f4358d96-catalog-content\") pod \"5c57787a-9e94-42b9-a082-3b17f4358d96\" (UID: \"5c57787a-9e94-42b9-a082-3b17f4358d96\") "
Sep 29 10:23:42 crc kubenswrapper[4891]: I0929 10:23:42.931000 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c57787a-9e94-42b9-a082-3b17f4358d96-utilities\") pod \"5c57787a-9e94-42b9-a082-3b17f4358d96\" (UID: \"5c57787a-9e94-42b9-a082-3b17f4358d96\") "
Sep 29 10:23:42 crc kubenswrapper[4891]: I0929 10:23:42.931134 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qfpb\" (UniqueName: \"kubernetes.io/projected/5c57787a-9e94-42b9-a082-3b17f4358d96-kube-api-access-5qfpb\") pod \"5c57787a-9e94-42b9-a082-3b17f4358d96\" (UID: \"5c57787a-9e94-42b9-a082-3b17f4358d96\") "
Sep 29 10:23:42 crc kubenswrapper[4891]: I0929 10:23:42.932418 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c57787a-9e94-42b9-a082-3b17f4358d96-utilities" (OuterVolumeSpecName: "utilities") pod "5c57787a-9e94-42b9-a082-3b17f4358d96" (UID: "5c57787a-9e94-42b9-a082-3b17f4358d96"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 10:23:42 crc kubenswrapper[4891]: I0929 10:23:42.937172 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c57787a-9e94-42b9-a082-3b17f4358d96-kube-api-access-5qfpb" (OuterVolumeSpecName: "kube-api-access-5qfpb") pod "5c57787a-9e94-42b9-a082-3b17f4358d96" (UID: "5c57787a-9e94-42b9-a082-3b17f4358d96"). InnerVolumeSpecName "kube-api-access-5qfpb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:23:43 crc kubenswrapper[4891]: I0929 10:23:43.033091 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qfpb\" (UniqueName: \"kubernetes.io/projected/5c57787a-9e94-42b9-a082-3b17f4358d96-kube-api-access-5qfpb\") on node \"crc\" DevicePath \"\""
Sep 29 10:23:43 crc kubenswrapper[4891]: I0929 10:23:43.033128 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c57787a-9e94-42b9-a082-3b17f4358d96-utilities\") on node \"crc\" DevicePath \"\""
Sep 29 10:23:43 crc kubenswrapper[4891]: I0929 10:23:43.119779 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c57787a-9e94-42b9-a082-3b17f4358d96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c57787a-9e94-42b9-a082-3b17f4358d96" (UID: "5c57787a-9e94-42b9-a082-3b17f4358d96"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 10:23:43 crc kubenswrapper[4891]: I0929 10:23:43.136236 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c57787a-9e94-42b9-a082-3b17f4358d96-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 29 10:23:43 crc kubenswrapper[4891]: I0929 10:23:43.371733 4891 generic.go:334] "Generic (PLEG): container finished" podID="5c57787a-9e94-42b9-a082-3b17f4358d96" containerID="f4d923cb0388e7b8ee2c0b13bd4130bd69378361653873536cb4f8228f0a9453" exitCode=0
Sep 29 10:23:43 crc kubenswrapper[4891]: I0929 10:23:43.371818 4891 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-4cmsh" Sep 29 10:23:43 crc kubenswrapper[4891]: I0929 10:23:43.373668 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4cmsh" event={"ID":"5c57787a-9e94-42b9-a082-3b17f4358d96","Type":"ContainerDied","Data":"f4d923cb0388e7b8ee2c0b13bd4130bd69378361653873536cb4f8228f0a9453"} Sep 29 10:23:43 crc kubenswrapper[4891]: I0929 10:23:43.374052 4891 scope.go:117] "RemoveContainer" containerID="f4d923cb0388e7b8ee2c0b13bd4130bd69378361653873536cb4f8228f0a9453" Sep 29 10:23:43 crc kubenswrapper[4891]: I0929 10:23:43.375130 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4cmsh" event={"ID":"5c57787a-9e94-42b9-a082-3b17f4358d96","Type":"ContainerDied","Data":"875b8ff900ed8a822a24376d45bdda8e0e76d5479e1d0eaf76b40481fac19e64"} Sep 29 10:23:43 crc kubenswrapper[4891]: I0929 10:23:43.400753 4891 scope.go:117] "RemoveContainer" containerID="a1e10971d8c39ec3da630d352d93eba247f2807c4ca94a00fbe2cd2f6be6201d" Sep 29 10:23:43 crc kubenswrapper[4891]: I0929 10:23:43.414102 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4cmsh"] Sep 29 10:23:43 crc kubenswrapper[4891]: I0929 10:23:43.425609 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4cmsh"] Sep 29 10:23:43 crc kubenswrapper[4891]: I0929 10:23:43.444144 4891 scope.go:117] "RemoveContainer" containerID="29d64ccb4afc4de288575fd7f138071e6d967cedfa637812a18ac2ab7d2ea86e" Sep 29 10:23:43 crc kubenswrapper[4891]: I0929 10:23:43.475420 4891 scope.go:117] "RemoveContainer" containerID="f4d923cb0388e7b8ee2c0b13bd4130bd69378361653873536cb4f8228f0a9453" Sep 29 10:23:43 crc kubenswrapper[4891]: E0929 10:23:43.475973 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f4d923cb0388e7b8ee2c0b13bd4130bd69378361653873536cb4f8228f0a9453\": container with ID starting with f4d923cb0388e7b8ee2c0b13bd4130bd69378361653873536cb4f8228f0a9453 not found: ID does not exist" containerID="f4d923cb0388e7b8ee2c0b13bd4130bd69378361653873536cb4f8228f0a9453" Sep 29 10:23:43 crc kubenswrapper[4891]: I0929 10:23:43.476032 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4d923cb0388e7b8ee2c0b13bd4130bd69378361653873536cb4f8228f0a9453"} err="failed to get container status \"f4d923cb0388e7b8ee2c0b13bd4130bd69378361653873536cb4f8228f0a9453\": rpc error: code = NotFound desc = could not find container \"f4d923cb0388e7b8ee2c0b13bd4130bd69378361653873536cb4f8228f0a9453\": container with ID starting with f4d923cb0388e7b8ee2c0b13bd4130bd69378361653873536cb4f8228f0a9453 not found: ID does not exist" Sep 29 10:23:43 crc kubenswrapper[4891]: I0929 10:23:43.476064 4891 scope.go:117] "RemoveContainer" containerID="a1e10971d8c39ec3da630d352d93eba247f2807c4ca94a00fbe2cd2f6be6201d" Sep 29 10:23:43 crc kubenswrapper[4891]: E0929 10:23:43.476418 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1e10971d8c39ec3da630d352d93eba247f2807c4ca94a00fbe2cd2f6be6201d\": container with ID starting with a1e10971d8c39ec3da630d352d93eba247f2807c4ca94a00fbe2cd2f6be6201d not found: ID does not exist" containerID="a1e10971d8c39ec3da630d352d93eba247f2807c4ca94a00fbe2cd2f6be6201d" Sep 29 10:23:43 crc kubenswrapper[4891]: I0929 10:23:43.476450 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1e10971d8c39ec3da630d352d93eba247f2807c4ca94a00fbe2cd2f6be6201d"} err="failed to get container status \"a1e10971d8c39ec3da630d352d93eba247f2807c4ca94a00fbe2cd2f6be6201d\": rpc error: code = NotFound desc = could not find container \"a1e10971d8c39ec3da630d352d93eba247f2807c4ca94a00fbe2cd2f6be6201d\": container with ID 
starting with a1e10971d8c39ec3da630d352d93eba247f2807c4ca94a00fbe2cd2f6be6201d not found: ID does not exist" Sep 29 10:23:43 crc kubenswrapper[4891]: I0929 10:23:43.476468 4891 scope.go:117] "RemoveContainer" containerID="29d64ccb4afc4de288575fd7f138071e6d967cedfa637812a18ac2ab7d2ea86e" Sep 29 10:23:43 crc kubenswrapper[4891]: E0929 10:23:43.476923 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29d64ccb4afc4de288575fd7f138071e6d967cedfa637812a18ac2ab7d2ea86e\": container with ID starting with 29d64ccb4afc4de288575fd7f138071e6d967cedfa637812a18ac2ab7d2ea86e not found: ID does not exist" containerID="29d64ccb4afc4de288575fd7f138071e6d967cedfa637812a18ac2ab7d2ea86e" Sep 29 10:23:43 crc kubenswrapper[4891]: I0929 10:23:43.476951 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29d64ccb4afc4de288575fd7f138071e6d967cedfa637812a18ac2ab7d2ea86e"} err="failed to get container status \"29d64ccb4afc4de288575fd7f138071e6d967cedfa637812a18ac2ab7d2ea86e\": rpc error: code = NotFound desc = could not find container \"29d64ccb4afc4de288575fd7f138071e6d967cedfa637812a18ac2ab7d2ea86e\": container with ID starting with 29d64ccb4afc4de288575fd7f138071e6d967cedfa637812a18ac2ab7d2ea86e not found: ID does not exist" Sep 29 10:23:44 crc kubenswrapper[4891]: I0929 10:23:44.417212 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c57787a-9e94-42b9-a082-3b17f4358d96" path="/var/lib/kubelet/pods/5c57787a-9e94-42b9-a082-3b17f4358d96/volumes" Sep 29 10:23:47 crc kubenswrapper[4891]: I0929 10:23:47.891196 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7mxd2"] Sep 29 10:23:47 crc kubenswrapper[4891]: E0929 10:23:47.892045 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c57787a-9e94-42b9-a082-3b17f4358d96" containerName="registry-server" Sep 29 10:23:47 crc 
kubenswrapper[4891]: I0929 10:23:47.892059 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c57787a-9e94-42b9-a082-3b17f4358d96" containerName="registry-server" Sep 29 10:23:47 crc kubenswrapper[4891]: E0929 10:23:47.892082 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c57787a-9e94-42b9-a082-3b17f4358d96" containerName="extract-content" Sep 29 10:23:47 crc kubenswrapper[4891]: I0929 10:23:47.892088 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c57787a-9e94-42b9-a082-3b17f4358d96" containerName="extract-content" Sep 29 10:23:47 crc kubenswrapper[4891]: E0929 10:23:47.892105 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c57787a-9e94-42b9-a082-3b17f4358d96" containerName="extract-utilities" Sep 29 10:23:47 crc kubenswrapper[4891]: I0929 10:23:47.892112 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c57787a-9e94-42b9-a082-3b17f4358d96" containerName="extract-utilities" Sep 29 10:23:47 crc kubenswrapper[4891]: I0929 10:23:47.892316 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c57787a-9e94-42b9-a082-3b17f4358d96" containerName="registry-server" Sep 29 10:23:47 crc kubenswrapper[4891]: I0929 10:23:47.896243 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7mxd2" Sep 29 10:23:47 crc kubenswrapper[4891]: I0929 10:23:47.902727 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7mxd2"] Sep 29 10:23:48 crc kubenswrapper[4891]: I0929 10:23:48.030507 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57924cea-825a-49d0-baee-d97c67037eef-catalog-content\") pod \"certified-operators-7mxd2\" (UID: \"57924cea-825a-49d0-baee-d97c67037eef\") " pod="openshift-marketplace/certified-operators-7mxd2" Sep 29 10:23:48 crc kubenswrapper[4891]: I0929 10:23:48.030846 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg947\" (UniqueName: \"kubernetes.io/projected/57924cea-825a-49d0-baee-d97c67037eef-kube-api-access-zg947\") pod \"certified-operators-7mxd2\" (UID: \"57924cea-825a-49d0-baee-d97c67037eef\") " pod="openshift-marketplace/certified-operators-7mxd2" Sep 29 10:23:48 crc kubenswrapper[4891]: I0929 10:23:48.030957 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57924cea-825a-49d0-baee-d97c67037eef-utilities\") pod \"certified-operators-7mxd2\" (UID: \"57924cea-825a-49d0-baee-d97c67037eef\") " pod="openshift-marketplace/certified-operators-7mxd2" Sep 29 10:23:48 crc kubenswrapper[4891]: I0929 10:23:48.132686 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg947\" (UniqueName: \"kubernetes.io/projected/57924cea-825a-49d0-baee-d97c67037eef-kube-api-access-zg947\") pod \"certified-operators-7mxd2\" (UID: \"57924cea-825a-49d0-baee-d97c67037eef\") " pod="openshift-marketplace/certified-operators-7mxd2" Sep 29 10:23:48 crc kubenswrapper[4891]: I0929 10:23:48.132756 4891 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57924cea-825a-49d0-baee-d97c67037eef-utilities\") pod \"certified-operators-7mxd2\" (UID: \"57924cea-825a-49d0-baee-d97c67037eef\") " pod="openshift-marketplace/certified-operators-7mxd2" Sep 29 10:23:48 crc kubenswrapper[4891]: I0929 10:23:48.132906 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57924cea-825a-49d0-baee-d97c67037eef-catalog-content\") pod \"certified-operators-7mxd2\" (UID: \"57924cea-825a-49d0-baee-d97c67037eef\") " pod="openshift-marketplace/certified-operators-7mxd2" Sep 29 10:23:48 crc kubenswrapper[4891]: I0929 10:23:48.133585 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57924cea-825a-49d0-baee-d97c67037eef-catalog-content\") pod \"certified-operators-7mxd2\" (UID: \"57924cea-825a-49d0-baee-d97c67037eef\") " pod="openshift-marketplace/certified-operators-7mxd2" Sep 29 10:23:48 crc kubenswrapper[4891]: I0929 10:23:48.133768 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57924cea-825a-49d0-baee-d97c67037eef-utilities\") pod \"certified-operators-7mxd2\" (UID: \"57924cea-825a-49d0-baee-d97c67037eef\") " pod="openshift-marketplace/certified-operators-7mxd2" Sep 29 10:23:48 crc kubenswrapper[4891]: I0929 10:23:48.154815 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg947\" (UniqueName: \"kubernetes.io/projected/57924cea-825a-49d0-baee-d97c67037eef-kube-api-access-zg947\") pod \"certified-operators-7mxd2\" (UID: \"57924cea-825a-49d0-baee-d97c67037eef\") " pod="openshift-marketplace/certified-operators-7mxd2" Sep 29 10:23:48 crc kubenswrapper[4891]: I0929 10:23:48.229502 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7mxd2" Sep 29 10:23:48 crc kubenswrapper[4891]: I0929 10:23:48.527285 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7mxd2"] Sep 29 10:23:49 crc kubenswrapper[4891]: I0929 10:23:49.450332 4891 generic.go:334] "Generic (PLEG): container finished" podID="57924cea-825a-49d0-baee-d97c67037eef" containerID="00776ac32883dae3e27d112942a9999eb4902fb66e9b79f73fc2cfde20cdeaef" exitCode=0 Sep 29 10:23:49 crc kubenswrapper[4891]: I0929 10:23:49.450427 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7mxd2" event={"ID":"57924cea-825a-49d0-baee-d97c67037eef","Type":"ContainerDied","Data":"00776ac32883dae3e27d112942a9999eb4902fb66e9b79f73fc2cfde20cdeaef"} Sep 29 10:23:49 crc kubenswrapper[4891]: I0929 10:23:49.451926 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7mxd2" event={"ID":"57924cea-825a-49d0-baee-d97c67037eef","Type":"ContainerStarted","Data":"eec3eaf5edb7d21b12765f901dd003f6a71b0da76c801fdce9527d87b6c8bbe3"} Sep 29 10:23:51 crc kubenswrapper[4891]: I0929 10:23:51.473361 4891 generic.go:334] "Generic (PLEG): container finished" podID="57924cea-825a-49d0-baee-d97c67037eef" containerID="7eab1e03b023a94031cd4a1bbe6c798082d2c0e75aa6c75bd06445c932980a09" exitCode=0 Sep 29 10:23:51 crc kubenswrapper[4891]: I0929 10:23:51.473426 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7mxd2" event={"ID":"57924cea-825a-49d0-baee-d97c67037eef","Type":"ContainerDied","Data":"7eab1e03b023a94031cd4a1bbe6c798082d2c0e75aa6c75bd06445c932980a09"} Sep 29 10:23:52 crc kubenswrapper[4891]: I0929 10:23:52.485643 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7mxd2" 
event={"ID":"57924cea-825a-49d0-baee-d97c67037eef","Type":"ContainerStarted","Data":"5e6e6f3d928f7d056f76fe16724b676f053b8be22057fcea0aae8adc6835e859"} Sep 29 10:23:52 crc kubenswrapper[4891]: I0929 10:23:52.505850 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7mxd2" podStartSLOduration=2.93658506 podStartE2EDuration="5.505831322s" podCreationTimestamp="2025-09-29 10:23:47 +0000 UTC" firstStartedPulling="2025-09-29 10:23:49.453334442 +0000 UTC m=+2159.658502773" lastFinishedPulling="2025-09-29 10:23:52.022580714 +0000 UTC m=+2162.227749035" observedRunningTime="2025-09-29 10:23:52.503139394 +0000 UTC m=+2162.708307755" watchObservedRunningTime="2025-09-29 10:23:52.505831322 +0000 UTC m=+2162.710999643" Sep 29 10:23:58 crc kubenswrapper[4891]: I0929 10:23:58.230555 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7mxd2" Sep 29 10:23:58 crc kubenswrapper[4891]: I0929 10:23:58.231141 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7mxd2" Sep 29 10:23:58 crc kubenswrapper[4891]: I0929 10:23:58.272811 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7mxd2" Sep 29 10:23:58 crc kubenswrapper[4891]: I0929 10:23:58.593331 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7mxd2" Sep 29 10:24:04 crc kubenswrapper[4891]: I0929 10:24:04.672351 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7mxd2"] Sep 29 10:24:04 crc kubenswrapper[4891]: I0929 10:24:04.672944 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7mxd2" podUID="57924cea-825a-49d0-baee-d97c67037eef" containerName="registry-server" 
containerID="cri-o://5e6e6f3d928f7d056f76fe16724b676f053b8be22057fcea0aae8adc6835e859" gracePeriod=2 Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.120876 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7mxd2" Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.280085 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg947\" (UniqueName: \"kubernetes.io/projected/57924cea-825a-49d0-baee-d97c67037eef-kube-api-access-zg947\") pod \"57924cea-825a-49d0-baee-d97c67037eef\" (UID: \"57924cea-825a-49d0-baee-d97c67037eef\") " Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.281326 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57924cea-825a-49d0-baee-d97c67037eef-catalog-content\") pod \"57924cea-825a-49d0-baee-d97c67037eef\" (UID: \"57924cea-825a-49d0-baee-d97c67037eef\") " Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.281407 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57924cea-825a-49d0-baee-d97c67037eef-utilities\") pod \"57924cea-825a-49d0-baee-d97c67037eef\" (UID: \"57924cea-825a-49d0-baee-d97c67037eef\") " Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.283364 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57924cea-825a-49d0-baee-d97c67037eef-utilities" (OuterVolumeSpecName: "utilities") pod "57924cea-825a-49d0-baee-d97c67037eef" (UID: "57924cea-825a-49d0-baee-d97c67037eef"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.301751 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57924cea-825a-49d0-baee-d97c67037eef-kube-api-access-zg947" (OuterVolumeSpecName: "kube-api-access-zg947") pod "57924cea-825a-49d0-baee-d97c67037eef" (UID: "57924cea-825a-49d0-baee-d97c67037eef"). InnerVolumeSpecName "kube-api-access-zg947". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.330223 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57924cea-825a-49d0-baee-d97c67037eef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57924cea-825a-49d0-baee-d97c67037eef" (UID: "57924cea-825a-49d0-baee-d97c67037eef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.383816 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zg947\" (UniqueName: \"kubernetes.io/projected/57924cea-825a-49d0-baee-d97c67037eef-kube-api-access-zg947\") on node \"crc\" DevicePath \"\"" Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.383855 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57924cea-825a-49d0-baee-d97c67037eef-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.383865 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57924cea-825a-49d0-baee-d97c67037eef-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.616062 4891 generic.go:334] "Generic (PLEG): container finished" podID="57924cea-825a-49d0-baee-d97c67037eef" 
containerID="5e6e6f3d928f7d056f76fe16724b676f053b8be22057fcea0aae8adc6835e859" exitCode=0 Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.616137 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7mxd2" Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.616182 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7mxd2" event={"ID":"57924cea-825a-49d0-baee-d97c67037eef","Type":"ContainerDied","Data":"5e6e6f3d928f7d056f76fe16724b676f053b8be22057fcea0aae8adc6835e859"} Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.617198 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7mxd2" event={"ID":"57924cea-825a-49d0-baee-d97c67037eef","Type":"ContainerDied","Data":"eec3eaf5edb7d21b12765f901dd003f6a71b0da76c801fdce9527d87b6c8bbe3"} Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.617239 4891 scope.go:117] "RemoveContainer" containerID="5e6e6f3d928f7d056f76fe16724b676f053b8be22057fcea0aae8adc6835e859" Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.651374 4891 scope.go:117] "RemoveContainer" containerID="7eab1e03b023a94031cd4a1bbe6c798082d2c0e75aa6c75bd06445c932980a09" Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.676762 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7mxd2"] Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.684671 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7mxd2"] Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.690948 4891 scope.go:117] "RemoveContainer" containerID="00776ac32883dae3e27d112942a9999eb4902fb66e9b79f73fc2cfde20cdeaef" Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.735368 4891 scope.go:117] "RemoveContainer" containerID="5e6e6f3d928f7d056f76fe16724b676f053b8be22057fcea0aae8adc6835e859" Sep 29 
10:24:05 crc kubenswrapper[4891]: E0929 10:24:05.735806 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e6e6f3d928f7d056f76fe16724b676f053b8be22057fcea0aae8adc6835e859\": container with ID starting with 5e6e6f3d928f7d056f76fe16724b676f053b8be22057fcea0aae8adc6835e859 not found: ID does not exist" containerID="5e6e6f3d928f7d056f76fe16724b676f053b8be22057fcea0aae8adc6835e859" Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.735840 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e6e6f3d928f7d056f76fe16724b676f053b8be22057fcea0aae8adc6835e859"} err="failed to get container status \"5e6e6f3d928f7d056f76fe16724b676f053b8be22057fcea0aae8adc6835e859\": rpc error: code = NotFound desc = could not find container \"5e6e6f3d928f7d056f76fe16724b676f053b8be22057fcea0aae8adc6835e859\": container with ID starting with 5e6e6f3d928f7d056f76fe16724b676f053b8be22057fcea0aae8adc6835e859 not found: ID does not exist" Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.735861 4891 scope.go:117] "RemoveContainer" containerID="7eab1e03b023a94031cd4a1bbe6c798082d2c0e75aa6c75bd06445c932980a09" Sep 29 10:24:05 crc kubenswrapper[4891]: E0929 10:24:05.736201 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eab1e03b023a94031cd4a1bbe6c798082d2c0e75aa6c75bd06445c932980a09\": container with ID starting with 7eab1e03b023a94031cd4a1bbe6c798082d2c0e75aa6c75bd06445c932980a09 not found: ID does not exist" containerID="7eab1e03b023a94031cd4a1bbe6c798082d2c0e75aa6c75bd06445c932980a09" Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.736294 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eab1e03b023a94031cd4a1bbe6c798082d2c0e75aa6c75bd06445c932980a09"} err="failed to get container status 
\"7eab1e03b023a94031cd4a1bbe6c798082d2c0e75aa6c75bd06445c932980a09\": rpc error: code = NotFound desc = could not find container \"7eab1e03b023a94031cd4a1bbe6c798082d2c0e75aa6c75bd06445c932980a09\": container with ID starting with 7eab1e03b023a94031cd4a1bbe6c798082d2c0e75aa6c75bd06445c932980a09 not found: ID does not exist" Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.736369 4891 scope.go:117] "RemoveContainer" containerID="00776ac32883dae3e27d112942a9999eb4902fb66e9b79f73fc2cfde20cdeaef" Sep 29 10:24:05 crc kubenswrapper[4891]: E0929 10:24:05.736755 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00776ac32883dae3e27d112942a9999eb4902fb66e9b79f73fc2cfde20cdeaef\": container with ID starting with 00776ac32883dae3e27d112942a9999eb4902fb66e9b79f73fc2cfde20cdeaef not found: ID does not exist" containerID="00776ac32883dae3e27d112942a9999eb4902fb66e9b79f73fc2cfde20cdeaef" Sep 29 10:24:05 crc kubenswrapper[4891]: I0929 10:24:05.736838 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00776ac32883dae3e27d112942a9999eb4902fb66e9b79f73fc2cfde20cdeaef"} err="failed to get container status \"00776ac32883dae3e27d112942a9999eb4902fb66e9b79f73fc2cfde20cdeaef\": rpc error: code = NotFound desc = could not find container \"00776ac32883dae3e27d112942a9999eb4902fb66e9b79f73fc2cfde20cdeaef\": container with ID starting with 00776ac32883dae3e27d112942a9999eb4902fb66e9b79f73fc2cfde20cdeaef not found: ID does not exist" Sep 29 10:24:06 crc kubenswrapper[4891]: I0929 10:24:06.415478 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57924cea-825a-49d0-baee-d97c67037eef" path="/var/lib/kubelet/pods/57924cea-825a-49d0-baee-d97c67037eef/volumes" Sep 29 10:25:06 crc kubenswrapper[4891]: I0929 10:25:06.186437 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:25:06 crc kubenswrapper[4891]: I0929 10:25:06.187081 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:25:36 crc kubenswrapper[4891]: I0929 10:25:36.185998 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:25:36 crc kubenswrapper[4891]: I0929 10:25:36.186566 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:26:06 crc kubenswrapper[4891]: I0929 10:26:06.186700 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:26:06 crc kubenswrapper[4891]: I0929 10:26:06.187302 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:26:06 crc kubenswrapper[4891]: I0929 10:26:06.187357 4891 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 10:26:06 crc kubenswrapper[4891]: I0929 10:26:06.188231 4891 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece"} pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 10:26:06 crc kubenswrapper[4891]: I0929 10:26:06.188301 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" containerID="cri-o://8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" gracePeriod=600 Sep 29 10:26:06 crc kubenswrapper[4891]: E0929 10:26:06.325507 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:26:06 crc kubenswrapper[4891]: I0929 10:26:06.797720 4891 generic.go:334] "Generic (PLEG): container finished" podID="582de198-5a15-4c4c-aaea-881c638a42ac" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" exitCode=0 Sep 29 10:26:06 crc kubenswrapper[4891]: I0929 10:26:06.797779 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerDied","Data":"8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece"} Sep 29 10:26:06 crc kubenswrapper[4891]: I0929 10:26:06.797840 4891 scope.go:117] "RemoveContainer" containerID="bff6fe3da678e36dd64ab83405868d66f2473cdc2a11bd70b1d1ce788f8ab4cf" Sep 29 10:26:06 crc kubenswrapper[4891]: I0929 10:26:06.798536 4891 scope.go:117] "RemoveContainer" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 29 10:26:06 crc kubenswrapper[4891]: E0929 10:26:06.798869 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:26:19 crc kubenswrapper[4891]: I0929 10:26:19.395911 4891 scope.go:117] "RemoveContainer" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 29 10:26:19 crc kubenswrapper[4891]: E0929 10:26:19.396806 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:26:33 crc kubenswrapper[4891]: I0929 10:26:33.395430 4891 scope.go:117] "RemoveContainer" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 29 10:26:33 crc kubenswrapper[4891]: E0929 10:26:33.396176 4891 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:26:46 crc kubenswrapper[4891]: I0929 10:26:46.395696 4891 scope.go:117] "RemoveContainer" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 29 10:26:46 crc kubenswrapper[4891]: E0929 10:26:46.396684 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:26:59 crc kubenswrapper[4891]: I0929 10:26:59.396340 4891 scope.go:117] "RemoveContainer" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 29 10:26:59 crc kubenswrapper[4891]: E0929 10:26:59.397053 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:27:06 crc kubenswrapper[4891]: I0929 10:27:06.353744 4891 generic.go:334] "Generic (PLEG): container finished" podID="ed5239b3-e586-4a56-89ce-74977f3509db" 
containerID="7a6fd8ce37c0b51978f17eaedb6d1b89a7c8fb02a4666d34dd14d4748fe12360" exitCode=0 Sep 29 10:27:06 crc kubenswrapper[4891]: I0929 10:27:06.353857 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5" event={"ID":"ed5239b3-e586-4a56-89ce-74977f3509db","Type":"ContainerDied","Data":"7a6fd8ce37c0b51978f17eaedb6d1b89a7c8fb02a4666d34dd14d4748fe12360"} Sep 29 10:27:07 crc kubenswrapper[4891]: I0929 10:27:07.774887 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5" Sep 29 10:27:07 crc kubenswrapper[4891]: I0929 10:27:07.877186 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed5239b3-e586-4a56-89ce-74977f3509db-libvirt-combined-ca-bundle\") pod \"ed5239b3-e586-4a56-89ce-74977f3509db\" (UID: \"ed5239b3-e586-4a56-89ce-74977f3509db\") " Sep 29 10:27:07 crc kubenswrapper[4891]: I0929 10:27:07.877255 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed5239b3-e586-4a56-89ce-74977f3509db-inventory\") pod \"ed5239b3-e586-4a56-89ce-74977f3509db\" (UID: \"ed5239b3-e586-4a56-89ce-74977f3509db\") " Sep 29 10:27:07 crc kubenswrapper[4891]: I0929 10:27:07.877358 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ed5239b3-e586-4a56-89ce-74977f3509db-ssh-key\") pod \"ed5239b3-e586-4a56-89ce-74977f3509db\" (UID: \"ed5239b3-e586-4a56-89ce-74977f3509db\") " Sep 29 10:27:07 crc kubenswrapper[4891]: I0929 10:27:07.877456 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wvvn\" (UniqueName: \"kubernetes.io/projected/ed5239b3-e586-4a56-89ce-74977f3509db-kube-api-access-6wvvn\") pod 
\"ed5239b3-e586-4a56-89ce-74977f3509db\" (UID: \"ed5239b3-e586-4a56-89ce-74977f3509db\") " Sep 29 10:27:07 crc kubenswrapper[4891]: I0929 10:27:07.877512 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ed5239b3-e586-4a56-89ce-74977f3509db-libvirt-secret-0\") pod \"ed5239b3-e586-4a56-89ce-74977f3509db\" (UID: \"ed5239b3-e586-4a56-89ce-74977f3509db\") " Sep 29 10:27:07 crc kubenswrapper[4891]: I0929 10:27:07.890236 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed5239b3-e586-4a56-89ce-74977f3509db-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "ed5239b3-e586-4a56-89ce-74977f3509db" (UID: "ed5239b3-e586-4a56-89ce-74977f3509db"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:27:07 crc kubenswrapper[4891]: I0929 10:27:07.896980 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed5239b3-e586-4a56-89ce-74977f3509db-kube-api-access-6wvvn" (OuterVolumeSpecName: "kube-api-access-6wvvn") pod "ed5239b3-e586-4a56-89ce-74977f3509db" (UID: "ed5239b3-e586-4a56-89ce-74977f3509db"). InnerVolumeSpecName "kube-api-access-6wvvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:27:07 crc kubenswrapper[4891]: I0929 10:27:07.906242 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed5239b3-e586-4a56-89ce-74977f3509db-inventory" (OuterVolumeSpecName: "inventory") pod "ed5239b3-e586-4a56-89ce-74977f3509db" (UID: "ed5239b3-e586-4a56-89ce-74977f3509db"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:27:07 crc kubenswrapper[4891]: I0929 10:27:07.911400 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed5239b3-e586-4a56-89ce-74977f3509db-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ed5239b3-e586-4a56-89ce-74977f3509db" (UID: "ed5239b3-e586-4a56-89ce-74977f3509db"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:27:07 crc kubenswrapper[4891]: I0929 10:27:07.920603 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed5239b3-e586-4a56-89ce-74977f3509db-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "ed5239b3-e586-4a56-89ce-74977f3509db" (UID: "ed5239b3-e586-4a56-89ce-74977f3509db"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:27:07 crc kubenswrapper[4891]: I0929 10:27:07.979304 4891 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ed5239b3-e586-4a56-89ce-74977f3509db-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:27:07 crc kubenswrapper[4891]: I0929 10:27:07.979346 4891 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed5239b3-e586-4a56-89ce-74977f3509db-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:27:07 crc kubenswrapper[4891]: I0929 10:27:07.979361 4891 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed5239b3-e586-4a56-89ce-74977f3509db-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:27:07 crc kubenswrapper[4891]: I0929 10:27:07.979373 4891 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ed5239b3-e586-4a56-89ce-74977f3509db-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:27:07 crc 
kubenswrapper[4891]: I0929 10:27:07.979390 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wvvn\" (UniqueName: \"kubernetes.io/projected/ed5239b3-e586-4a56-89ce-74977f3509db-kube-api-access-6wvvn\") on node \"crc\" DevicePath \"\"" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.372853 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5" event={"ID":"ed5239b3-e586-4a56-89ce-74977f3509db","Type":"ContainerDied","Data":"28119cf2140977338535f41387def677e1934bfa0c794393b2b9c63c733e2c7e"} Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.373209 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28119cf2140977338535f41387def677e1934bfa0c794393b2b9c63c733e2c7e" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.372918 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-l85z5" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.475831 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms"] Sep 29 10:27:08 crc kubenswrapper[4891]: E0929 10:27:08.476286 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57924cea-825a-49d0-baee-d97c67037eef" containerName="extract-content" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.476308 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="57924cea-825a-49d0-baee-d97c67037eef" containerName="extract-content" Sep 29 10:27:08 crc kubenswrapper[4891]: E0929 10:27:08.476327 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed5239b3-e586-4a56-89ce-74977f3509db" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.476337 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed5239b3-e586-4a56-89ce-74977f3509db" 
containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Sep 29 10:27:08 crc kubenswrapper[4891]: E0929 10:27:08.476355 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57924cea-825a-49d0-baee-d97c67037eef" containerName="registry-server" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.476365 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="57924cea-825a-49d0-baee-d97c67037eef" containerName="registry-server" Sep 29 10:27:08 crc kubenswrapper[4891]: E0929 10:27:08.476385 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57924cea-825a-49d0-baee-d97c67037eef" containerName="extract-utilities" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.476393 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="57924cea-825a-49d0-baee-d97c67037eef" containerName="extract-utilities" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.476622 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="57924cea-825a-49d0-baee-d97c67037eef" containerName="registry-server" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.476639 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed5239b3-e586-4a56-89ce-74977f3509db" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.477445 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.479659 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.479679 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.480281 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.480440 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.480617 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b9rgd" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.480808 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.481028 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.492210 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms"] Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.589768 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9257358d-6c0c-43ba-831e-c68505df09d8-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 
10:27:08.590108 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.590241 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.590421 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.590601 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.590733 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" 
(UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.590887 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.591099 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzq7l\" (UniqueName: \"kubernetes.io/projected/9257358d-6c0c-43ba-831e-c68505df09d8-kube-api-access-fzq7l\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.591439 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.693246 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" 
Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.693342 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9257358d-6c0c-43ba-831e-c68505df09d8-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.693385 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.693424 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.693468 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.693562 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.693607 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.693650 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.693694 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzq7l\" (UniqueName: \"kubernetes.io/projected/9257358d-6c0c-43ba-831e-c68505df09d8-kube-api-access-fzq7l\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.695467 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9257358d-6c0c-43ba-831e-c68505df09d8-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc 
kubenswrapper[4891]: I0929 10:27:08.697624 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.698157 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.698522 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.698828 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.699285 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: 
\"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.701864 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.704656 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.715779 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzq7l\" (UniqueName: \"kubernetes.io/projected/9257358d-6c0c-43ba-831e-c68505df09d8-kube-api-access-fzq7l\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nbfms\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:08 crc kubenswrapper[4891]: I0929 10:27:08.830502 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:27:09 crc kubenswrapper[4891]: I0929 10:27:09.188687 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms"] Sep 29 10:27:09 crc kubenswrapper[4891]: I0929 10:27:09.194931 4891 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 10:27:09 crc kubenswrapper[4891]: I0929 10:27:09.387771 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" event={"ID":"9257358d-6c0c-43ba-831e-c68505df09d8","Type":"ContainerStarted","Data":"0c41ae627d763a010eb2cd44f0d18b9369f9cef79f417e80576edf5a38545c5a"} Sep 29 10:27:12 crc kubenswrapper[4891]: I0929 10:27:12.416246 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" event={"ID":"9257358d-6c0c-43ba-831e-c68505df09d8","Type":"ContainerStarted","Data":"1e95716fa23891f2d312e68bd18711f2111ae9593021c171d87b0af8a5c713cd"} Sep 29 10:27:12 crc kubenswrapper[4891]: I0929 10:27:12.435707 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" podStartSLOduration=2.2825211420000002 podStartE2EDuration="4.435684758s" podCreationTimestamp="2025-09-29 10:27:08 +0000 UTC" firstStartedPulling="2025-09-29 10:27:09.194709613 +0000 UTC m=+2359.399877934" lastFinishedPulling="2025-09-29 10:27:11.347873229 +0000 UTC m=+2361.553041550" observedRunningTime="2025-09-29 10:27:12.432781514 +0000 UTC m=+2362.637949885" watchObservedRunningTime="2025-09-29 10:27:12.435684758 +0000 UTC m=+2362.640853079" Sep 29 10:27:13 crc kubenswrapper[4891]: I0929 10:27:13.395282 4891 scope.go:117] "RemoveContainer" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 29 10:27:13 crc kubenswrapper[4891]: E0929 10:27:13.395984 4891 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:27:24 crc kubenswrapper[4891]: I0929 10:27:24.396477 4891 scope.go:117] "RemoveContainer" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 29 10:27:24 crc kubenswrapper[4891]: E0929 10:27:24.397318 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:27:37 crc kubenswrapper[4891]: I0929 10:27:37.396021 4891 scope.go:117] "RemoveContainer" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 29 10:27:37 crc kubenswrapper[4891]: E0929 10:27:37.396750 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:27:52 crc kubenswrapper[4891]: I0929 10:27:52.397340 4891 scope.go:117] "RemoveContainer" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 29 10:27:52 crc kubenswrapper[4891]: E0929 
10:27:52.398485 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:28:03 crc kubenswrapper[4891]: I0929 10:28:03.396532 4891 scope.go:117] "RemoveContainer" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 29 10:28:03 crc kubenswrapper[4891]: E0929 10:28:03.397429 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:28:15 crc kubenswrapper[4891]: I0929 10:28:15.395994 4891 scope.go:117] "RemoveContainer" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 29 10:28:15 crc kubenswrapper[4891]: E0929 10:28:15.397014 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:28:27 crc kubenswrapper[4891]: I0929 10:28:27.396307 4891 scope.go:117] "RemoveContainer" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 29 10:28:27 crc 
kubenswrapper[4891]: E0929 10:28:27.397089 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:28:42 crc kubenswrapper[4891]: I0929 10:28:42.396971 4891 scope.go:117] "RemoveContainer" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 29 10:28:42 crc kubenswrapper[4891]: E0929 10:28:42.397688 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:28:57 crc kubenswrapper[4891]: I0929 10:28:57.397666 4891 scope.go:117] "RemoveContainer" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 29 10:28:57 crc kubenswrapper[4891]: E0929 10:28:57.401210 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:29:11 crc kubenswrapper[4891]: I0929 10:29:11.396564 4891 scope.go:117] "RemoveContainer" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 
29 10:29:11 crc kubenswrapper[4891]: E0929 10:29:11.397634 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:29:22 crc kubenswrapper[4891]: I0929 10:29:22.396081 4891 scope.go:117] "RemoveContainer" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 29 10:29:22 crc kubenswrapper[4891]: E0929 10:29:22.396877 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:29:33 crc kubenswrapper[4891]: I0929 10:29:33.397057 4891 scope.go:117] "RemoveContainer" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 29 10:29:33 crc kubenswrapper[4891]: E0929 10:29:33.399384 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:29:47 crc kubenswrapper[4891]: I0929 10:29:47.395942 4891 scope.go:117] "RemoveContainer" 
containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 29 10:29:47 crc kubenswrapper[4891]: E0929 10:29:47.398809 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:30:00 crc kubenswrapper[4891]: I0929 10:30:00.147325 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319030-jcp72"] Sep 29 10:30:00 crc kubenswrapper[4891]: I0929 10:30:00.149269 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-jcp72" Sep 29 10:30:00 crc kubenswrapper[4891]: I0929 10:30:00.151740 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 29 10:30:00 crc kubenswrapper[4891]: I0929 10:30:00.153540 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 29 10:30:00 crc kubenswrapper[4891]: I0929 10:30:00.159024 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319030-jcp72"] Sep 29 10:30:00 crc kubenswrapper[4891]: I0929 10:30:00.278938 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75588803-34f1-46ee-90b6-edca66193b41-config-volume\") pod \"collect-profiles-29319030-jcp72\" (UID: \"75588803-34f1-46ee-90b6-edca66193b41\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-jcp72" Sep 29 10:30:00 crc kubenswrapper[4891]: I0929 10:30:00.279884 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75588803-34f1-46ee-90b6-edca66193b41-secret-volume\") pod \"collect-profiles-29319030-jcp72\" (UID: \"75588803-34f1-46ee-90b6-edca66193b41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-jcp72" Sep 29 10:30:00 crc kubenswrapper[4891]: I0929 10:30:00.280020 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlg77\" (UniqueName: \"kubernetes.io/projected/75588803-34f1-46ee-90b6-edca66193b41-kube-api-access-hlg77\") pod \"collect-profiles-29319030-jcp72\" (UID: \"75588803-34f1-46ee-90b6-edca66193b41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-jcp72" Sep 29 10:30:00 crc kubenswrapper[4891]: I0929 10:30:00.381977 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75588803-34f1-46ee-90b6-edca66193b41-config-volume\") pod \"collect-profiles-29319030-jcp72\" (UID: \"75588803-34f1-46ee-90b6-edca66193b41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-jcp72" Sep 29 10:30:00 crc kubenswrapper[4891]: I0929 10:30:00.382495 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75588803-34f1-46ee-90b6-edca66193b41-secret-volume\") pod \"collect-profiles-29319030-jcp72\" (UID: \"75588803-34f1-46ee-90b6-edca66193b41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-jcp72" Sep 29 10:30:00 crc kubenswrapper[4891]: I0929 10:30:00.382665 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlg77\" (UniqueName: 
\"kubernetes.io/projected/75588803-34f1-46ee-90b6-edca66193b41-kube-api-access-hlg77\") pod \"collect-profiles-29319030-jcp72\" (UID: \"75588803-34f1-46ee-90b6-edca66193b41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-jcp72" Sep 29 10:30:00 crc kubenswrapper[4891]: I0929 10:30:00.382956 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75588803-34f1-46ee-90b6-edca66193b41-config-volume\") pod \"collect-profiles-29319030-jcp72\" (UID: \"75588803-34f1-46ee-90b6-edca66193b41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-jcp72" Sep 29 10:30:00 crc kubenswrapper[4891]: I0929 10:30:00.401444 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75588803-34f1-46ee-90b6-edca66193b41-secret-volume\") pod \"collect-profiles-29319030-jcp72\" (UID: \"75588803-34f1-46ee-90b6-edca66193b41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-jcp72" Sep 29 10:30:00 crc kubenswrapper[4891]: I0929 10:30:00.407348 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlg77\" (UniqueName: \"kubernetes.io/projected/75588803-34f1-46ee-90b6-edca66193b41-kube-api-access-hlg77\") pod \"collect-profiles-29319030-jcp72\" (UID: \"75588803-34f1-46ee-90b6-edca66193b41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-jcp72" Sep 29 10:30:00 crc kubenswrapper[4891]: I0929 10:30:00.473879 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-jcp72" Sep 29 10:30:00 crc kubenswrapper[4891]: I0929 10:30:00.906013 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319030-jcp72"] Sep 29 10:30:01 crc kubenswrapper[4891]: I0929 10:30:01.395945 4891 scope.go:117] "RemoveContainer" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 29 10:30:01 crc kubenswrapper[4891]: E0929 10:30:01.396258 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:30:01 crc kubenswrapper[4891]: I0929 10:30:01.913993 4891 generic.go:334] "Generic (PLEG): container finished" podID="75588803-34f1-46ee-90b6-edca66193b41" containerID="2cd1ae043ecf09fe3ded15369312f440a430fcdfcd2009ccadcda95d15b46036" exitCode=0 Sep 29 10:30:01 crc kubenswrapper[4891]: I0929 10:30:01.914055 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-jcp72" event={"ID":"75588803-34f1-46ee-90b6-edca66193b41","Type":"ContainerDied","Data":"2cd1ae043ecf09fe3ded15369312f440a430fcdfcd2009ccadcda95d15b46036"} Sep 29 10:30:01 crc kubenswrapper[4891]: I0929 10:30:01.914093 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-jcp72" event={"ID":"75588803-34f1-46ee-90b6-edca66193b41","Type":"ContainerStarted","Data":"2d913d5c8aff36ed40530a9ece54f6bd4160f1fbcee806d470c6b4c9dd01a6cd"} Sep 29 10:30:03 crc kubenswrapper[4891]: I0929 10:30:03.300415 4891 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-jcp72" Sep 29 10:30:03 crc kubenswrapper[4891]: I0929 10:30:03.452954 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlg77\" (UniqueName: \"kubernetes.io/projected/75588803-34f1-46ee-90b6-edca66193b41-kube-api-access-hlg77\") pod \"75588803-34f1-46ee-90b6-edca66193b41\" (UID: \"75588803-34f1-46ee-90b6-edca66193b41\") " Sep 29 10:30:03 crc kubenswrapper[4891]: I0929 10:30:03.453052 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75588803-34f1-46ee-90b6-edca66193b41-secret-volume\") pod \"75588803-34f1-46ee-90b6-edca66193b41\" (UID: \"75588803-34f1-46ee-90b6-edca66193b41\") " Sep 29 10:30:03 crc kubenswrapper[4891]: I0929 10:30:03.453362 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75588803-34f1-46ee-90b6-edca66193b41-config-volume\") pod \"75588803-34f1-46ee-90b6-edca66193b41\" (UID: \"75588803-34f1-46ee-90b6-edca66193b41\") " Sep 29 10:30:03 crc kubenswrapper[4891]: I0929 10:30:03.454180 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75588803-34f1-46ee-90b6-edca66193b41-config-volume" (OuterVolumeSpecName: "config-volume") pod "75588803-34f1-46ee-90b6-edca66193b41" (UID: "75588803-34f1-46ee-90b6-edca66193b41"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:30:03 crc kubenswrapper[4891]: I0929 10:30:03.461633 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75588803-34f1-46ee-90b6-edca66193b41-kube-api-access-hlg77" (OuterVolumeSpecName: "kube-api-access-hlg77") pod "75588803-34f1-46ee-90b6-edca66193b41" (UID: "75588803-34f1-46ee-90b6-edca66193b41"). 
InnerVolumeSpecName "kube-api-access-hlg77". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:30:03 crc kubenswrapper[4891]: I0929 10:30:03.461663 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75588803-34f1-46ee-90b6-edca66193b41-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "75588803-34f1-46ee-90b6-edca66193b41" (UID: "75588803-34f1-46ee-90b6-edca66193b41"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:30:03 crc kubenswrapper[4891]: I0929 10:30:03.555214 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlg77\" (UniqueName: \"kubernetes.io/projected/75588803-34f1-46ee-90b6-edca66193b41-kube-api-access-hlg77\") on node \"crc\" DevicePath \"\"" Sep 29 10:30:03 crc kubenswrapper[4891]: I0929 10:30:03.555261 4891 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75588803-34f1-46ee-90b6-edca66193b41-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 29 10:30:03 crc kubenswrapper[4891]: I0929 10:30:03.555275 4891 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75588803-34f1-46ee-90b6-edca66193b41-config-volume\") on node \"crc\" DevicePath \"\"" Sep 29 10:30:03 crc kubenswrapper[4891]: I0929 10:30:03.950406 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-jcp72" event={"ID":"75588803-34f1-46ee-90b6-edca66193b41","Type":"ContainerDied","Data":"2d913d5c8aff36ed40530a9ece54f6bd4160f1fbcee806d470c6b4c9dd01a6cd"} Sep 29 10:30:03 crc kubenswrapper[4891]: I0929 10:30:03.950463 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d913d5c8aff36ed40530a9ece54f6bd4160f1fbcee806d470c6b4c9dd01a6cd" Sep 29 10:30:03 crc kubenswrapper[4891]: I0929 10:30:03.950465 4891 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-jcp72" Sep 29 10:30:04 crc kubenswrapper[4891]: I0929 10:30:04.375620 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29318985-cvhgh"] Sep 29 10:30:04 crc kubenswrapper[4891]: I0929 10:30:04.382673 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29318985-cvhgh"] Sep 29 10:30:04 crc kubenswrapper[4891]: I0929 10:30:04.419043 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="443c2a5c-8366-4170-80e7-063687c1caaf" path="/var/lib/kubelet/pods/443c2a5c-8366-4170-80e7-063687c1caaf/volumes" Sep 29 10:30:16 crc kubenswrapper[4891]: I0929 10:30:16.396560 4891 scope.go:117] "RemoveContainer" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 29 10:30:16 crc kubenswrapper[4891]: E0929 10:30:16.397358 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:30:16 crc kubenswrapper[4891]: I0929 10:30:16.904664 4891 scope.go:117] "RemoveContainer" containerID="77cbba8bbf10cd7adc0d37679db0ce0c3b360ceb2e5f3f1efa0147177f7e80f7" Sep 29 10:30:31 crc kubenswrapper[4891]: I0929 10:30:31.206534 4891 generic.go:334] "Generic (PLEG): container finished" podID="9257358d-6c0c-43ba-831e-c68505df09d8" containerID="1e95716fa23891f2d312e68bd18711f2111ae9593021c171d87b0af8a5c713cd" exitCode=0 Sep 29 10:30:31 crc kubenswrapper[4891]: I0929 10:30:31.206684 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" event={"ID":"9257358d-6c0c-43ba-831e-c68505df09d8","Type":"ContainerDied","Data":"1e95716fa23891f2d312e68bd18711f2111ae9593021c171d87b0af8a5c713cd"} Sep 29 10:30:31 crc kubenswrapper[4891]: I0929 10:30:31.396579 4891 scope.go:117] "RemoveContainer" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 29 10:30:31 crc kubenswrapper[4891]: E0929 10:30:31.397635 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.645308 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.807490 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-ssh-key\") pod \"9257358d-6c0c-43ba-831e-c68505df09d8\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.807537 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzq7l\" (UniqueName: \"kubernetes.io/projected/9257358d-6c0c-43ba-831e-c68505df09d8-kube-api-access-fzq7l\") pod \"9257358d-6c0c-43ba-831e-c68505df09d8\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.807582 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9257358d-6c0c-43ba-831e-c68505df09d8-nova-extra-config-0\") pod \"9257358d-6c0c-43ba-831e-c68505df09d8\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.807716 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-migration-ssh-key-0\") pod \"9257358d-6c0c-43ba-831e-c68505df09d8\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.807742 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-migration-ssh-key-1\") pod \"9257358d-6c0c-43ba-831e-c68505df09d8\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.808845 4891 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-cell1-compute-config-1\") pod \"9257358d-6c0c-43ba-831e-c68505df09d8\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.808922 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-combined-ca-bundle\") pod \"9257358d-6c0c-43ba-831e-c68505df09d8\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.808984 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-inventory\") pod \"9257358d-6c0c-43ba-831e-c68505df09d8\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.809068 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-cell1-compute-config-0\") pod \"9257358d-6c0c-43ba-831e-c68505df09d8\" (UID: \"9257358d-6c0c-43ba-831e-c68505df09d8\") " Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.814210 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9257358d-6c0c-43ba-831e-c68505df09d8" (UID: "9257358d-6c0c-43ba-831e-c68505df09d8"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.814343 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9257358d-6c0c-43ba-831e-c68505df09d8-kube-api-access-fzq7l" (OuterVolumeSpecName: "kube-api-access-fzq7l") pod "9257358d-6c0c-43ba-831e-c68505df09d8" (UID: "9257358d-6c0c-43ba-831e-c68505df09d8"). InnerVolumeSpecName "kube-api-access-fzq7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.835028 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9257358d-6c0c-43ba-831e-c68505df09d8-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "9257358d-6c0c-43ba-831e-c68505df09d8" (UID: "9257358d-6c0c-43ba-831e-c68505df09d8"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.839816 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-inventory" (OuterVolumeSpecName: "inventory") pod "9257358d-6c0c-43ba-831e-c68505df09d8" (UID: "9257358d-6c0c-43ba-831e-c68505df09d8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.840344 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "9257358d-6c0c-43ba-831e-c68505df09d8" (UID: "9257358d-6c0c-43ba-831e-c68505df09d8"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.840933 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9257358d-6c0c-43ba-831e-c68505df09d8" (UID: "9257358d-6c0c-43ba-831e-c68505df09d8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.841411 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "9257358d-6c0c-43ba-831e-c68505df09d8" (UID: "9257358d-6c0c-43ba-831e-c68505df09d8"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.844842 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "9257358d-6c0c-43ba-831e-c68505df09d8" (UID: "9257358d-6c0c-43ba-831e-c68505df09d8"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.856686 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "9257358d-6c0c-43ba-831e-c68505df09d8" (UID: "9257358d-6c0c-43ba-831e-c68505df09d8"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.911318 4891 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.911344 4891 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.911353 4891 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.911363 4891 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.911374 4891 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.911382 4891 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.911390 4891 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9257358d-6c0c-43ba-831e-c68505df09d8-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 
29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.911399 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzq7l\" (UniqueName: \"kubernetes.io/projected/9257358d-6c0c-43ba-831e-c68505df09d8-kube-api-access-fzq7l\") on node \"crc\" DevicePath \"\"" Sep 29 10:30:32 crc kubenswrapper[4891]: I0929 10:30:32.911408 4891 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9257358d-6c0c-43ba-831e-c68505df09d8-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.225109 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" event={"ID":"9257358d-6c0c-43ba-831e-c68505df09d8","Type":"ContainerDied","Data":"0c41ae627d763a010eb2cd44f0d18b9369f9cef79f417e80576edf5a38545c5a"} Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.225428 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c41ae627d763a010eb2cd44f0d18b9369f9cef79f417e80576edf5a38545c5a" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.225181 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nbfms" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.322108 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg"] Sep 29 10:30:33 crc kubenswrapper[4891]: E0929 10:30:33.322524 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9257358d-6c0c-43ba-831e-c68505df09d8" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.322546 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="9257358d-6c0c-43ba-831e-c68505df09d8" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 29 10:30:33 crc kubenswrapper[4891]: E0929 10:30:33.322576 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75588803-34f1-46ee-90b6-edca66193b41" containerName="collect-profiles" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.322585 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="75588803-34f1-46ee-90b6-edca66193b41" containerName="collect-profiles" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.322842 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="75588803-34f1-46ee-90b6-edca66193b41" containerName="collect-profiles" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.322882 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="9257358d-6c0c-43ba-831e-c68505df09d8" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.323582 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.326241 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b9rgd" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.326265 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.326431 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.328522 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.328607 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.344279 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg"] Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.521684 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.521729 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhlfh\" (UniqueName: \"kubernetes.io/projected/451c7a1c-dd37-464d-b2c8-7924f1882509-kube-api-access-rhlfh\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.521763 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.521861 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.521894 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.521930 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.521978 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.624452 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.624653 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.624725 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.624880 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.625033 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.625279 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.625363 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhlfh\" (UniqueName: \"kubernetes.io/projected/451c7a1c-dd37-464d-b2c8-7924f1882509-kube-api-access-rhlfh\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.628745 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-ceilometer-compute-config-data-0\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.629116 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.629231 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.630755 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.646406 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.646512 4891 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.653269 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhlfh\" (UniqueName: \"kubernetes.io/projected/451c7a1c-dd37-464d-b2c8-7924f1882509-kube-api-access-rhlfh\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:30:33 crc kubenswrapper[4891]: I0929 10:30:33.941633 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:30:34 crc kubenswrapper[4891]: I0929 10:30:34.478049 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg"] Sep 29 10:30:35 crc kubenswrapper[4891]: I0929 10:30:35.244449 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" event={"ID":"451c7a1c-dd37-464d-b2c8-7924f1882509","Type":"ContainerStarted","Data":"ba9695c8b5dacf3da228c761a18fd798ae59f9a02425837094ba3e3eb83d5ce3"} Sep 29 10:30:36 crc kubenswrapper[4891]: I0929 10:30:36.254844 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" event={"ID":"451c7a1c-dd37-464d-b2c8-7924f1882509","Type":"ContainerStarted","Data":"e515b1cd97bea470462d493b807532caed5fd618f1a2e5971656ff536e96bad2"} Sep 29 10:30:36 crc kubenswrapper[4891]: I0929 10:30:36.272813 4891 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" podStartSLOduration=2.628230083 podStartE2EDuration="3.27278196s" podCreationTimestamp="2025-09-29 10:30:33 +0000 UTC" firstStartedPulling="2025-09-29 10:30:34.483123903 +0000 UTC m=+2564.688292224" lastFinishedPulling="2025-09-29 10:30:35.12767578 +0000 UTC m=+2565.332844101" observedRunningTime="2025-09-29 10:30:36.271381151 +0000 UTC m=+2566.476549482" watchObservedRunningTime="2025-09-29 10:30:36.27278196 +0000 UTC m=+2566.477950281" Sep 29 10:30:46 crc kubenswrapper[4891]: I0929 10:30:46.395257 4891 scope.go:117] "RemoveContainer" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 29 10:30:46 crc kubenswrapper[4891]: E0929 10:30:46.395998 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:30:55 crc kubenswrapper[4891]: I0929 10:30:55.717625 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s9wnq"] Sep 29 10:30:55 crc kubenswrapper[4891]: I0929 10:30:55.720879 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s9wnq" Sep 29 10:30:55 crc kubenswrapper[4891]: I0929 10:30:55.779613 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s9wnq"] Sep 29 10:30:55 crc kubenswrapper[4891]: I0929 10:30:55.831881 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fzht\" (UniqueName: \"kubernetes.io/projected/95274732-3841-49b2-81bc-877a464905e0-kube-api-access-9fzht\") pod \"redhat-operators-s9wnq\" (UID: \"95274732-3841-49b2-81bc-877a464905e0\") " pod="openshift-marketplace/redhat-operators-s9wnq" Sep 29 10:30:55 crc kubenswrapper[4891]: I0929 10:30:55.832567 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95274732-3841-49b2-81bc-877a464905e0-catalog-content\") pod \"redhat-operators-s9wnq\" (UID: \"95274732-3841-49b2-81bc-877a464905e0\") " pod="openshift-marketplace/redhat-operators-s9wnq" Sep 29 10:30:55 crc kubenswrapper[4891]: I0929 10:30:55.832626 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95274732-3841-49b2-81bc-877a464905e0-utilities\") pod \"redhat-operators-s9wnq\" (UID: \"95274732-3841-49b2-81bc-877a464905e0\") " pod="openshift-marketplace/redhat-operators-s9wnq" Sep 29 10:30:55 crc kubenswrapper[4891]: I0929 10:30:55.934813 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95274732-3841-49b2-81bc-877a464905e0-catalog-content\") pod \"redhat-operators-s9wnq\" (UID: \"95274732-3841-49b2-81bc-877a464905e0\") " pod="openshift-marketplace/redhat-operators-s9wnq" Sep 29 10:30:55 crc kubenswrapper[4891]: I0929 10:30:55.934895 4891 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95274732-3841-49b2-81bc-877a464905e0-utilities\") pod \"redhat-operators-s9wnq\" (UID: \"95274732-3841-49b2-81bc-877a464905e0\") " pod="openshift-marketplace/redhat-operators-s9wnq" Sep 29 10:30:55 crc kubenswrapper[4891]: I0929 10:30:55.935022 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fzht\" (UniqueName: \"kubernetes.io/projected/95274732-3841-49b2-81bc-877a464905e0-kube-api-access-9fzht\") pod \"redhat-operators-s9wnq\" (UID: \"95274732-3841-49b2-81bc-877a464905e0\") " pod="openshift-marketplace/redhat-operators-s9wnq" Sep 29 10:30:55 crc kubenswrapper[4891]: I0929 10:30:55.935349 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95274732-3841-49b2-81bc-877a464905e0-catalog-content\") pod \"redhat-operators-s9wnq\" (UID: \"95274732-3841-49b2-81bc-877a464905e0\") " pod="openshift-marketplace/redhat-operators-s9wnq" Sep 29 10:30:55 crc kubenswrapper[4891]: I0929 10:30:55.935629 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95274732-3841-49b2-81bc-877a464905e0-utilities\") pod \"redhat-operators-s9wnq\" (UID: \"95274732-3841-49b2-81bc-877a464905e0\") " pod="openshift-marketplace/redhat-operators-s9wnq" Sep 29 10:30:55 crc kubenswrapper[4891]: I0929 10:30:55.954975 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fzht\" (UniqueName: \"kubernetes.io/projected/95274732-3841-49b2-81bc-877a464905e0-kube-api-access-9fzht\") pod \"redhat-operators-s9wnq\" (UID: \"95274732-3841-49b2-81bc-877a464905e0\") " pod="openshift-marketplace/redhat-operators-s9wnq" Sep 29 10:30:56 crc kubenswrapper[4891]: I0929 10:30:56.087640 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s9wnq" Sep 29 10:30:56 crc kubenswrapper[4891]: I0929 10:30:56.553133 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s9wnq"] Sep 29 10:30:56 crc kubenswrapper[4891]: W0929 10:30:56.553329 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95274732_3841_49b2_81bc_877a464905e0.slice/crio-9013d1ffd357ecf00c0841dc9bc257088bfc61c256dd309a20bc0a0c296b9f57 WatchSource:0}: Error finding container 9013d1ffd357ecf00c0841dc9bc257088bfc61c256dd309a20bc0a0c296b9f57: Status 404 returned error can't find the container with id 9013d1ffd357ecf00c0841dc9bc257088bfc61c256dd309a20bc0a0c296b9f57 Sep 29 10:30:57 crc kubenswrapper[4891]: I0929 10:30:57.446115 4891 generic.go:334] "Generic (PLEG): container finished" podID="95274732-3841-49b2-81bc-877a464905e0" containerID="d147dc74711a6bbc98511928a87a97b8c0bc805960637907988f4496bd669139" exitCode=0 Sep 29 10:30:57 crc kubenswrapper[4891]: I0929 10:30:57.446173 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s9wnq" event={"ID":"95274732-3841-49b2-81bc-877a464905e0","Type":"ContainerDied","Data":"d147dc74711a6bbc98511928a87a97b8c0bc805960637907988f4496bd669139"} Sep 29 10:30:57 crc kubenswrapper[4891]: I0929 10:30:57.446458 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s9wnq" event={"ID":"95274732-3841-49b2-81bc-877a464905e0","Type":"ContainerStarted","Data":"9013d1ffd357ecf00c0841dc9bc257088bfc61c256dd309a20bc0a0c296b9f57"} Sep 29 10:30:58 crc kubenswrapper[4891]: I0929 10:30:58.457183 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s9wnq" 
event={"ID":"95274732-3841-49b2-81bc-877a464905e0","Type":"ContainerStarted","Data":"e96c87494a63e347c07e268bc312c9168c3e9a2b1c217b9c61d08501169c8e21"} Sep 29 10:30:59 crc kubenswrapper[4891]: I0929 10:30:59.468750 4891 generic.go:334] "Generic (PLEG): container finished" podID="95274732-3841-49b2-81bc-877a464905e0" containerID="e96c87494a63e347c07e268bc312c9168c3e9a2b1c217b9c61d08501169c8e21" exitCode=0 Sep 29 10:30:59 crc kubenswrapper[4891]: I0929 10:30:59.468842 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s9wnq" event={"ID":"95274732-3841-49b2-81bc-877a464905e0","Type":"ContainerDied","Data":"e96c87494a63e347c07e268bc312c9168c3e9a2b1c217b9c61d08501169c8e21"} Sep 29 10:31:00 crc kubenswrapper[4891]: I0929 10:31:00.405180 4891 scope.go:117] "RemoveContainer" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 29 10:31:00 crc kubenswrapper[4891]: E0929 10:31:00.405645 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:31:00 crc kubenswrapper[4891]: I0929 10:31:00.479064 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s9wnq" event={"ID":"95274732-3841-49b2-81bc-877a464905e0","Type":"ContainerStarted","Data":"308f9fbb05a2b7d5bf1a554d89ceaf1f58d6f9be433a10f4b81b2fff26926ac6"} Sep 29 10:31:06 crc kubenswrapper[4891]: I0929 10:31:06.087871 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s9wnq" Sep 29 10:31:06 crc kubenswrapper[4891]: I0929 10:31:06.088418 4891 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s9wnq" Sep 29 10:31:06 crc kubenswrapper[4891]: I0929 10:31:06.134713 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s9wnq" Sep 29 10:31:06 crc kubenswrapper[4891]: I0929 10:31:06.171364 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s9wnq" podStartSLOduration=8.694090944 podStartE2EDuration="11.171302225s" podCreationTimestamp="2025-09-29 10:30:55 +0000 UTC" firstStartedPulling="2025-09-29 10:30:57.44782353 +0000 UTC m=+2587.652991841" lastFinishedPulling="2025-09-29 10:30:59.925034791 +0000 UTC m=+2590.130203122" observedRunningTime="2025-09-29 10:31:00.505321554 +0000 UTC m=+2590.710489875" watchObservedRunningTime="2025-09-29 10:31:06.171302225 +0000 UTC m=+2596.376470546" Sep 29 10:31:06 crc kubenswrapper[4891]: I0929 10:31:06.591910 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s9wnq" Sep 29 10:31:06 crc kubenswrapper[4891]: I0929 10:31:06.650156 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s9wnq"] Sep 29 10:31:08 crc kubenswrapper[4891]: I0929 10:31:08.552328 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s9wnq" podUID="95274732-3841-49b2-81bc-877a464905e0" containerName="registry-server" containerID="cri-o://308f9fbb05a2b7d5bf1a554d89ceaf1f58d6f9be433a10f4b81b2fff26926ac6" gracePeriod=2 Sep 29 10:31:08 crc kubenswrapper[4891]: I0929 10:31:08.983049 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s9wnq" Sep 29 10:31:09 crc kubenswrapper[4891]: I0929 10:31:09.084762 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fzht\" (UniqueName: \"kubernetes.io/projected/95274732-3841-49b2-81bc-877a464905e0-kube-api-access-9fzht\") pod \"95274732-3841-49b2-81bc-877a464905e0\" (UID: \"95274732-3841-49b2-81bc-877a464905e0\") " Sep 29 10:31:09 crc kubenswrapper[4891]: I0929 10:31:09.085330 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95274732-3841-49b2-81bc-877a464905e0-catalog-content\") pod \"95274732-3841-49b2-81bc-877a464905e0\" (UID: \"95274732-3841-49b2-81bc-877a464905e0\") " Sep 29 10:31:09 crc kubenswrapper[4891]: I0929 10:31:09.085399 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95274732-3841-49b2-81bc-877a464905e0-utilities\") pod \"95274732-3841-49b2-81bc-877a464905e0\" (UID: \"95274732-3841-49b2-81bc-877a464905e0\") " Sep 29 10:31:09 crc kubenswrapper[4891]: I0929 10:31:09.086267 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95274732-3841-49b2-81bc-877a464905e0-utilities" (OuterVolumeSpecName: "utilities") pod "95274732-3841-49b2-81bc-877a464905e0" (UID: "95274732-3841-49b2-81bc-877a464905e0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:31:09 crc kubenswrapper[4891]: I0929 10:31:09.089527 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95274732-3841-49b2-81bc-877a464905e0-kube-api-access-9fzht" (OuterVolumeSpecName: "kube-api-access-9fzht") pod "95274732-3841-49b2-81bc-877a464905e0" (UID: "95274732-3841-49b2-81bc-877a464905e0"). InnerVolumeSpecName "kube-api-access-9fzht". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:31:09 crc kubenswrapper[4891]: I0929 10:31:09.187019 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95274732-3841-49b2-81bc-877a464905e0-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:31:09 crc kubenswrapper[4891]: I0929 10:31:09.187047 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fzht\" (UniqueName: \"kubernetes.io/projected/95274732-3841-49b2-81bc-877a464905e0-kube-api-access-9fzht\") on node \"crc\" DevicePath \"\"" Sep 29 10:31:09 crc kubenswrapper[4891]: I0929 10:31:09.551736 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95274732-3841-49b2-81bc-877a464905e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95274732-3841-49b2-81bc-877a464905e0" (UID: "95274732-3841-49b2-81bc-877a464905e0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:31:09 crc kubenswrapper[4891]: I0929 10:31:09.565116 4891 generic.go:334] "Generic (PLEG): container finished" podID="95274732-3841-49b2-81bc-877a464905e0" containerID="308f9fbb05a2b7d5bf1a554d89ceaf1f58d6f9be433a10f4b81b2fff26926ac6" exitCode=0 Sep 29 10:31:09 crc kubenswrapper[4891]: I0929 10:31:09.565189 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s9wnq" event={"ID":"95274732-3841-49b2-81bc-877a464905e0","Type":"ContainerDied","Data":"308f9fbb05a2b7d5bf1a554d89ceaf1f58d6f9be433a10f4b81b2fff26926ac6"} Sep 29 10:31:09 crc kubenswrapper[4891]: I0929 10:31:09.565232 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s9wnq" Sep 29 10:31:09 crc kubenswrapper[4891]: I0929 10:31:09.565269 4891 scope.go:117] "RemoveContainer" containerID="308f9fbb05a2b7d5bf1a554d89ceaf1f58d6f9be433a10f4b81b2fff26926ac6" Sep 29 10:31:09 crc kubenswrapper[4891]: I0929 10:31:09.565227 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s9wnq" event={"ID":"95274732-3841-49b2-81bc-877a464905e0","Type":"ContainerDied","Data":"9013d1ffd357ecf00c0841dc9bc257088bfc61c256dd309a20bc0a0c296b9f57"} Sep 29 10:31:09 crc kubenswrapper[4891]: I0929 10:31:09.585109 4891 scope.go:117] "RemoveContainer" containerID="e96c87494a63e347c07e268bc312c9168c3e9a2b1c217b9c61d08501169c8e21" Sep 29 10:31:09 crc kubenswrapper[4891]: I0929 10:31:09.601056 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95274732-3841-49b2-81bc-877a464905e0-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:31:09 crc kubenswrapper[4891]: I0929 10:31:09.604631 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s9wnq"] Sep 29 10:31:09 crc kubenswrapper[4891]: I0929 10:31:09.612743 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s9wnq"] Sep 29 10:31:09 crc kubenswrapper[4891]: I0929 10:31:09.618212 4891 scope.go:117] "RemoveContainer" containerID="d147dc74711a6bbc98511928a87a97b8c0bc805960637907988f4496bd669139" Sep 29 10:31:09 crc kubenswrapper[4891]: I0929 10:31:09.652823 4891 scope.go:117] "RemoveContainer" containerID="308f9fbb05a2b7d5bf1a554d89ceaf1f58d6f9be433a10f4b81b2fff26926ac6" Sep 29 10:31:09 crc kubenswrapper[4891]: E0929 10:31:09.653454 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"308f9fbb05a2b7d5bf1a554d89ceaf1f58d6f9be433a10f4b81b2fff26926ac6\": container with ID 
starting with 308f9fbb05a2b7d5bf1a554d89ceaf1f58d6f9be433a10f4b81b2fff26926ac6 not found: ID does not exist" containerID="308f9fbb05a2b7d5bf1a554d89ceaf1f58d6f9be433a10f4b81b2fff26926ac6" Sep 29 10:31:09 crc kubenswrapper[4891]: I0929 10:31:09.653564 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"308f9fbb05a2b7d5bf1a554d89ceaf1f58d6f9be433a10f4b81b2fff26926ac6"} err="failed to get container status \"308f9fbb05a2b7d5bf1a554d89ceaf1f58d6f9be433a10f4b81b2fff26926ac6\": rpc error: code = NotFound desc = could not find container \"308f9fbb05a2b7d5bf1a554d89ceaf1f58d6f9be433a10f4b81b2fff26926ac6\": container with ID starting with 308f9fbb05a2b7d5bf1a554d89ceaf1f58d6f9be433a10f4b81b2fff26926ac6 not found: ID does not exist" Sep 29 10:31:09 crc kubenswrapper[4891]: I0929 10:31:09.653607 4891 scope.go:117] "RemoveContainer" containerID="e96c87494a63e347c07e268bc312c9168c3e9a2b1c217b9c61d08501169c8e21" Sep 29 10:31:09 crc kubenswrapper[4891]: E0929 10:31:09.653936 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e96c87494a63e347c07e268bc312c9168c3e9a2b1c217b9c61d08501169c8e21\": container with ID starting with e96c87494a63e347c07e268bc312c9168c3e9a2b1c217b9c61d08501169c8e21 not found: ID does not exist" containerID="e96c87494a63e347c07e268bc312c9168c3e9a2b1c217b9c61d08501169c8e21" Sep 29 10:31:09 crc kubenswrapper[4891]: I0929 10:31:09.653972 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e96c87494a63e347c07e268bc312c9168c3e9a2b1c217b9c61d08501169c8e21"} err="failed to get container status \"e96c87494a63e347c07e268bc312c9168c3e9a2b1c217b9c61d08501169c8e21\": rpc error: code = NotFound desc = could not find container \"e96c87494a63e347c07e268bc312c9168c3e9a2b1c217b9c61d08501169c8e21\": container with ID starting with e96c87494a63e347c07e268bc312c9168c3e9a2b1c217b9c61d08501169c8e21 not found: 
ID does not exist" Sep 29 10:31:09 crc kubenswrapper[4891]: I0929 10:31:09.654003 4891 scope.go:117] "RemoveContainer" containerID="d147dc74711a6bbc98511928a87a97b8c0bc805960637907988f4496bd669139" Sep 29 10:31:09 crc kubenswrapper[4891]: E0929 10:31:09.654247 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d147dc74711a6bbc98511928a87a97b8c0bc805960637907988f4496bd669139\": container with ID starting with d147dc74711a6bbc98511928a87a97b8c0bc805960637907988f4496bd669139 not found: ID does not exist" containerID="d147dc74711a6bbc98511928a87a97b8c0bc805960637907988f4496bd669139" Sep 29 10:31:09 crc kubenswrapper[4891]: I0929 10:31:09.654287 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d147dc74711a6bbc98511928a87a97b8c0bc805960637907988f4496bd669139"} err="failed to get container status \"d147dc74711a6bbc98511928a87a97b8c0bc805960637907988f4496bd669139\": rpc error: code = NotFound desc = could not find container \"d147dc74711a6bbc98511928a87a97b8c0bc805960637907988f4496bd669139\": container with ID starting with d147dc74711a6bbc98511928a87a97b8c0bc805960637907988f4496bd669139 not found: ID does not exist" Sep 29 10:31:10 crc kubenswrapper[4891]: I0929 10:31:10.411137 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95274732-3841-49b2-81bc-877a464905e0" path="/var/lib/kubelet/pods/95274732-3841-49b2-81bc-877a464905e0/volumes" Sep 29 10:31:13 crc kubenswrapper[4891]: I0929 10:31:13.395775 4891 scope.go:117] "RemoveContainer" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 29 10:31:13 crc kubenswrapper[4891]: I0929 10:31:13.602560 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" 
event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerStarted","Data":"3d8eb4b308d3b32817d1b6f98b14da560b979ce2f1778af16583ebb0639299b6"} Sep 29 10:32:42 crc kubenswrapper[4891]: I0929 10:32:42.348842 4891 generic.go:334] "Generic (PLEG): container finished" podID="451c7a1c-dd37-464d-b2c8-7924f1882509" containerID="e515b1cd97bea470462d493b807532caed5fd618f1a2e5971656ff536e96bad2" exitCode=0 Sep 29 10:32:42 crc kubenswrapper[4891]: I0929 10:32:42.348941 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" event={"ID":"451c7a1c-dd37-464d-b2c8-7924f1882509","Type":"ContainerDied","Data":"e515b1cd97bea470462d493b807532caed5fd618f1a2e5971656ff536e96bad2"} Sep 29 10:32:43 crc kubenswrapper[4891]: I0929 10:32:43.759961 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:32:43 crc kubenswrapper[4891]: I0929 10:32:43.896016 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhlfh\" (UniqueName: \"kubernetes.io/projected/451c7a1c-dd37-464d-b2c8-7924f1882509-kube-api-access-rhlfh\") pod \"451c7a1c-dd37-464d-b2c8-7924f1882509\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " Sep 29 10:32:43 crc kubenswrapper[4891]: I0929 10:32:43.896260 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-ceilometer-compute-config-data-0\") pod \"451c7a1c-dd37-464d-b2c8-7924f1882509\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " Sep 29 10:32:43 crc kubenswrapper[4891]: I0929 10:32:43.896326 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-ceilometer-compute-config-data-2\") pod \"451c7a1c-dd37-464d-b2c8-7924f1882509\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " Sep 29 10:32:43 crc kubenswrapper[4891]: I0929 10:32:43.896383 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-ssh-key\") pod \"451c7a1c-dd37-464d-b2c8-7924f1882509\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " Sep 29 10:32:43 crc kubenswrapper[4891]: I0929 10:32:43.896480 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-telemetry-combined-ca-bundle\") pod \"451c7a1c-dd37-464d-b2c8-7924f1882509\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " Sep 29 10:32:43 crc kubenswrapper[4891]: I0929 10:32:43.896542 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-inventory\") pod \"451c7a1c-dd37-464d-b2c8-7924f1882509\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " Sep 29 10:32:43 crc kubenswrapper[4891]: I0929 10:32:43.896588 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-ceilometer-compute-config-data-1\") pod \"451c7a1c-dd37-464d-b2c8-7924f1882509\" (UID: \"451c7a1c-dd37-464d-b2c8-7924f1882509\") " Sep 29 10:32:43 crc kubenswrapper[4891]: I0929 10:32:43.902908 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/451c7a1c-dd37-464d-b2c8-7924f1882509-kube-api-access-rhlfh" (OuterVolumeSpecName: "kube-api-access-rhlfh") pod "451c7a1c-dd37-464d-b2c8-7924f1882509" (UID: "451c7a1c-dd37-464d-b2c8-7924f1882509"). 
InnerVolumeSpecName "kube-api-access-rhlfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:32:43 crc kubenswrapper[4891]: I0929 10:32:43.913966 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "451c7a1c-dd37-464d-b2c8-7924f1882509" (UID: "451c7a1c-dd37-464d-b2c8-7924f1882509"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:32:43 crc kubenswrapper[4891]: I0929 10:32:43.927152 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-inventory" (OuterVolumeSpecName: "inventory") pod "451c7a1c-dd37-464d-b2c8-7924f1882509" (UID: "451c7a1c-dd37-464d-b2c8-7924f1882509"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:32:43 crc kubenswrapper[4891]: I0929 10:32:43.927816 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "451c7a1c-dd37-464d-b2c8-7924f1882509" (UID: "451c7a1c-dd37-464d-b2c8-7924f1882509"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:32:43 crc kubenswrapper[4891]: I0929 10:32:43.929831 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "451c7a1c-dd37-464d-b2c8-7924f1882509" (UID: "451c7a1c-dd37-464d-b2c8-7924f1882509"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:32:43 crc kubenswrapper[4891]: I0929 10:32:43.936506 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "451c7a1c-dd37-464d-b2c8-7924f1882509" (UID: "451c7a1c-dd37-464d-b2c8-7924f1882509"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:32:43 crc kubenswrapper[4891]: I0929 10:32:43.938078 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "451c7a1c-dd37-464d-b2c8-7924f1882509" (UID: "451c7a1c-dd37-464d-b2c8-7924f1882509"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:32:44 crc kubenswrapper[4891]: I0929 10:32:44.003334 4891 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:32:44 crc kubenswrapper[4891]: I0929 10:32:44.003409 4891 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:32:44 crc kubenswrapper[4891]: I0929 10:32:44.003424 4891 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Sep 29 10:32:44 crc kubenswrapper[4891]: I0929 10:32:44.003435 4891 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-rhlfh\" (UniqueName: \"kubernetes.io/projected/451c7a1c-dd37-464d-b2c8-7924f1882509-kube-api-access-rhlfh\") on node \"crc\" DevicePath \"\"" Sep 29 10:32:44 crc kubenswrapper[4891]: I0929 10:32:44.003448 4891 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:32:44 crc kubenswrapper[4891]: I0929 10:32:44.003480 4891 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Sep 29 10:32:44 crc kubenswrapper[4891]: I0929 10:32:44.003489 4891 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/451c7a1c-dd37-464d-b2c8-7924f1882509-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:32:44 crc kubenswrapper[4891]: I0929 10:32:44.368033 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" event={"ID":"451c7a1c-dd37-464d-b2c8-7924f1882509","Type":"ContainerDied","Data":"ba9695c8b5dacf3da228c761a18fd798ae59f9a02425837094ba3e3eb83d5ce3"} Sep 29 10:32:44 crc kubenswrapper[4891]: I0929 10:32:44.368088 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba9695c8b5dacf3da228c761a18fd798ae59f9a02425837094ba3e3eb83d5ce3" Sep 29 10:32:44 crc kubenswrapper[4891]: I0929 10:32:44.368106 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg" Sep 29 10:33:00 crc kubenswrapper[4891]: I0929 10:33:00.323514 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b88cm"] Sep 29 10:33:00 crc kubenswrapper[4891]: E0929 10:33:00.324348 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95274732-3841-49b2-81bc-877a464905e0" containerName="extract-content" Sep 29 10:33:00 crc kubenswrapper[4891]: I0929 10:33:00.324362 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="95274732-3841-49b2-81bc-877a464905e0" containerName="extract-content" Sep 29 10:33:00 crc kubenswrapper[4891]: E0929 10:33:00.324371 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95274732-3841-49b2-81bc-877a464905e0" containerName="registry-server" Sep 29 10:33:00 crc kubenswrapper[4891]: I0929 10:33:00.324378 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="95274732-3841-49b2-81bc-877a464905e0" containerName="registry-server" Sep 29 10:33:00 crc kubenswrapper[4891]: E0929 10:33:00.324387 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="451c7a1c-dd37-464d-b2c8-7924f1882509" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 29 10:33:00 crc kubenswrapper[4891]: I0929 10:33:00.324394 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="451c7a1c-dd37-464d-b2c8-7924f1882509" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 29 10:33:00 crc kubenswrapper[4891]: E0929 10:33:00.324424 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95274732-3841-49b2-81bc-877a464905e0" containerName="extract-utilities" Sep 29 10:33:00 crc kubenswrapper[4891]: I0929 10:33:00.324430 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="95274732-3841-49b2-81bc-877a464905e0" containerName="extract-utilities" Sep 29 10:33:00 crc kubenswrapper[4891]: I0929 10:33:00.324606 4891 
memory_manager.go:354] "RemoveStaleState removing state" podUID="95274732-3841-49b2-81bc-877a464905e0" containerName="registry-server" Sep 29 10:33:00 crc kubenswrapper[4891]: I0929 10:33:00.324622 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="451c7a1c-dd37-464d-b2c8-7924f1882509" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 29 10:33:00 crc kubenswrapper[4891]: I0929 10:33:00.325892 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b88cm" Sep 29 10:33:00 crc kubenswrapper[4891]: I0929 10:33:00.333630 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b88cm"] Sep 29 10:33:00 crc kubenswrapper[4891]: I0929 10:33:00.425761 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5vp8\" (UniqueName: \"kubernetes.io/projected/e682f4bf-acde-4985-b7d0-86f6c87b4142-kube-api-access-c5vp8\") pod \"redhat-marketplace-b88cm\" (UID: \"e682f4bf-acde-4985-b7d0-86f6c87b4142\") " pod="openshift-marketplace/redhat-marketplace-b88cm" Sep 29 10:33:00 crc kubenswrapper[4891]: I0929 10:33:00.426233 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e682f4bf-acde-4985-b7d0-86f6c87b4142-catalog-content\") pod \"redhat-marketplace-b88cm\" (UID: \"e682f4bf-acde-4985-b7d0-86f6c87b4142\") " pod="openshift-marketplace/redhat-marketplace-b88cm" Sep 29 10:33:00 crc kubenswrapper[4891]: I0929 10:33:00.426355 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e682f4bf-acde-4985-b7d0-86f6c87b4142-utilities\") pod \"redhat-marketplace-b88cm\" (UID: \"e682f4bf-acde-4985-b7d0-86f6c87b4142\") " pod="openshift-marketplace/redhat-marketplace-b88cm" Sep 29 10:33:00 crc 
kubenswrapper[4891]: I0929 10:33:00.527769 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5vp8\" (UniqueName: \"kubernetes.io/projected/e682f4bf-acde-4985-b7d0-86f6c87b4142-kube-api-access-c5vp8\") pod \"redhat-marketplace-b88cm\" (UID: \"e682f4bf-acde-4985-b7d0-86f6c87b4142\") " pod="openshift-marketplace/redhat-marketplace-b88cm" Sep 29 10:33:00 crc kubenswrapper[4891]: I0929 10:33:00.527878 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e682f4bf-acde-4985-b7d0-86f6c87b4142-catalog-content\") pod \"redhat-marketplace-b88cm\" (UID: \"e682f4bf-acde-4985-b7d0-86f6c87b4142\") " pod="openshift-marketplace/redhat-marketplace-b88cm" Sep 29 10:33:00 crc kubenswrapper[4891]: I0929 10:33:00.527987 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e682f4bf-acde-4985-b7d0-86f6c87b4142-utilities\") pod \"redhat-marketplace-b88cm\" (UID: \"e682f4bf-acde-4985-b7d0-86f6c87b4142\") " pod="openshift-marketplace/redhat-marketplace-b88cm" Sep 29 10:33:00 crc kubenswrapper[4891]: I0929 10:33:00.528584 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e682f4bf-acde-4985-b7d0-86f6c87b4142-utilities\") pod \"redhat-marketplace-b88cm\" (UID: \"e682f4bf-acde-4985-b7d0-86f6c87b4142\") " pod="openshift-marketplace/redhat-marketplace-b88cm" Sep 29 10:33:00 crc kubenswrapper[4891]: I0929 10:33:00.528899 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e682f4bf-acde-4985-b7d0-86f6c87b4142-catalog-content\") pod \"redhat-marketplace-b88cm\" (UID: \"e682f4bf-acde-4985-b7d0-86f6c87b4142\") " pod="openshift-marketplace/redhat-marketplace-b88cm" Sep 29 10:33:00 crc kubenswrapper[4891]: I0929 10:33:00.550426 4891 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5vp8\" (UniqueName: \"kubernetes.io/projected/e682f4bf-acde-4985-b7d0-86f6c87b4142-kube-api-access-c5vp8\") pod \"redhat-marketplace-b88cm\" (UID: \"e682f4bf-acde-4985-b7d0-86f6c87b4142\") " pod="openshift-marketplace/redhat-marketplace-b88cm" Sep 29 10:33:00 crc kubenswrapper[4891]: I0929 10:33:00.676232 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b88cm" Sep 29 10:33:01 crc kubenswrapper[4891]: I0929 10:33:01.214201 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b88cm"] Sep 29 10:33:01 crc kubenswrapper[4891]: I0929 10:33:01.527366 4891 generic.go:334] "Generic (PLEG): container finished" podID="e682f4bf-acde-4985-b7d0-86f6c87b4142" containerID="3ad0394a296c5c508a882aada9817c4ca731cab7229697c33e7c747174990338" exitCode=0 Sep 29 10:33:01 crc kubenswrapper[4891]: I0929 10:33:01.527668 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b88cm" event={"ID":"e682f4bf-acde-4985-b7d0-86f6c87b4142","Type":"ContainerDied","Data":"3ad0394a296c5c508a882aada9817c4ca731cab7229697c33e7c747174990338"} Sep 29 10:33:01 crc kubenswrapper[4891]: I0929 10:33:01.527701 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b88cm" event={"ID":"e682f4bf-acde-4985-b7d0-86f6c87b4142","Type":"ContainerStarted","Data":"cd0dafde50c0ce039e9d340d6889e4e11bbf9b6c91f65c4c564bfcede46f73e6"} Sep 29 10:33:01 crc kubenswrapper[4891]: I0929 10:33:01.531601 4891 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 10:33:02 crc kubenswrapper[4891]: I0929 10:33:02.538487 4891 generic.go:334] "Generic (PLEG): container finished" podID="e682f4bf-acde-4985-b7d0-86f6c87b4142" 
containerID="ea3fc7723f81487bc8d30e7c51377980ee550306c43d862f0a8b7f570a115c97" exitCode=0 Sep 29 10:33:02 crc kubenswrapper[4891]: I0929 10:33:02.538691 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b88cm" event={"ID":"e682f4bf-acde-4985-b7d0-86f6c87b4142","Type":"ContainerDied","Data":"ea3fc7723f81487bc8d30e7c51377980ee550306c43d862f0a8b7f570a115c97"} Sep 29 10:33:03 crc kubenswrapper[4891]: I0929 10:33:03.552750 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b88cm" event={"ID":"e682f4bf-acde-4985-b7d0-86f6c87b4142","Type":"ContainerStarted","Data":"9a2ab9d183376251e6a2c20605f2c10282a6af4788d28db6fcac9254326c44d5"} Sep 29 10:33:03 crc kubenswrapper[4891]: I0929 10:33:03.579155 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b88cm" podStartSLOduration=2.189106297 podStartE2EDuration="3.579136985s" podCreationTimestamp="2025-09-29 10:33:00 +0000 UTC" firstStartedPulling="2025-09-29 10:33:01.531316869 +0000 UTC m=+2711.736485190" lastFinishedPulling="2025-09-29 10:33:02.921347557 +0000 UTC m=+2713.126515878" observedRunningTime="2025-09-29 10:33:03.574808174 +0000 UTC m=+2713.779976525" watchObservedRunningTime="2025-09-29 10:33:03.579136985 +0000 UTC m=+2713.784305316" Sep 29 10:33:10 crc kubenswrapper[4891]: I0929 10:33:10.676722 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b88cm" Sep 29 10:33:10 crc kubenswrapper[4891]: I0929 10:33:10.677620 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b88cm" Sep 29 10:33:10 crc kubenswrapper[4891]: I0929 10:33:10.752353 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b88cm" Sep 29 10:33:11 crc kubenswrapper[4891]: I0929 10:33:11.675481 
4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b88cm" Sep 29 10:33:11 crc kubenswrapper[4891]: I0929 10:33:11.726091 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b88cm"] Sep 29 10:33:13 crc kubenswrapper[4891]: I0929 10:33:13.639504 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b88cm" podUID="e682f4bf-acde-4985-b7d0-86f6c87b4142" containerName="registry-server" containerID="cri-o://9a2ab9d183376251e6a2c20605f2c10282a6af4788d28db6fcac9254326c44d5" gracePeriod=2 Sep 29 10:33:14 crc kubenswrapper[4891]: I0929 10:33:14.055925 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b88cm" Sep 29 10:33:14 crc kubenswrapper[4891]: I0929 10:33:14.195201 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5vp8\" (UniqueName: \"kubernetes.io/projected/e682f4bf-acde-4985-b7d0-86f6c87b4142-kube-api-access-c5vp8\") pod \"e682f4bf-acde-4985-b7d0-86f6c87b4142\" (UID: \"e682f4bf-acde-4985-b7d0-86f6c87b4142\") " Sep 29 10:33:14 crc kubenswrapper[4891]: I0929 10:33:14.195365 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e682f4bf-acde-4985-b7d0-86f6c87b4142-utilities\") pod \"e682f4bf-acde-4985-b7d0-86f6c87b4142\" (UID: \"e682f4bf-acde-4985-b7d0-86f6c87b4142\") " Sep 29 10:33:14 crc kubenswrapper[4891]: I0929 10:33:14.195498 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e682f4bf-acde-4985-b7d0-86f6c87b4142-catalog-content\") pod \"e682f4bf-acde-4985-b7d0-86f6c87b4142\" (UID: \"e682f4bf-acde-4985-b7d0-86f6c87b4142\") " Sep 29 10:33:14 crc kubenswrapper[4891]: I0929 10:33:14.196261 
4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e682f4bf-acde-4985-b7d0-86f6c87b4142-utilities" (OuterVolumeSpecName: "utilities") pod "e682f4bf-acde-4985-b7d0-86f6c87b4142" (UID: "e682f4bf-acde-4985-b7d0-86f6c87b4142"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:33:14 crc kubenswrapper[4891]: I0929 10:33:14.206693 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e682f4bf-acde-4985-b7d0-86f6c87b4142-kube-api-access-c5vp8" (OuterVolumeSpecName: "kube-api-access-c5vp8") pod "e682f4bf-acde-4985-b7d0-86f6c87b4142" (UID: "e682f4bf-acde-4985-b7d0-86f6c87b4142"). InnerVolumeSpecName "kube-api-access-c5vp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:33:14 crc kubenswrapper[4891]: I0929 10:33:14.211410 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e682f4bf-acde-4985-b7d0-86f6c87b4142-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e682f4bf-acde-4985-b7d0-86f6c87b4142" (UID: "e682f4bf-acde-4985-b7d0-86f6c87b4142"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:33:14 crc kubenswrapper[4891]: I0929 10:33:14.298156 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e682f4bf-acde-4985-b7d0-86f6c87b4142-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:33:14 crc kubenswrapper[4891]: I0929 10:33:14.298189 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5vp8\" (UniqueName: \"kubernetes.io/projected/e682f4bf-acde-4985-b7d0-86f6c87b4142-kube-api-access-c5vp8\") on node \"crc\" DevicePath \"\"" Sep 29 10:33:14 crc kubenswrapper[4891]: I0929 10:33:14.298202 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e682f4bf-acde-4985-b7d0-86f6c87b4142-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:33:14 crc kubenswrapper[4891]: I0929 10:33:14.650938 4891 generic.go:334] "Generic (PLEG): container finished" podID="e682f4bf-acde-4985-b7d0-86f6c87b4142" containerID="9a2ab9d183376251e6a2c20605f2c10282a6af4788d28db6fcac9254326c44d5" exitCode=0 Sep 29 10:33:14 crc kubenswrapper[4891]: I0929 10:33:14.650995 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b88cm" event={"ID":"e682f4bf-acde-4985-b7d0-86f6c87b4142","Type":"ContainerDied","Data":"9a2ab9d183376251e6a2c20605f2c10282a6af4788d28db6fcac9254326c44d5"} Sep 29 10:33:14 crc kubenswrapper[4891]: I0929 10:33:14.651043 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b88cm" event={"ID":"e682f4bf-acde-4985-b7d0-86f6c87b4142","Type":"ContainerDied","Data":"cd0dafde50c0ce039e9d340d6889e4e11bbf9b6c91f65c4c564bfcede46f73e6"} Sep 29 10:33:14 crc kubenswrapper[4891]: I0929 10:33:14.651060 4891 scope.go:117] "RemoveContainer" containerID="9a2ab9d183376251e6a2c20605f2c10282a6af4788d28db6fcac9254326c44d5" Sep 29 10:33:14 crc kubenswrapper[4891]: I0929 
10:33:14.651071 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b88cm" Sep 29 10:33:14 crc kubenswrapper[4891]: I0929 10:33:14.679916 4891 scope.go:117] "RemoveContainer" containerID="ea3fc7723f81487bc8d30e7c51377980ee550306c43d862f0a8b7f570a115c97" Sep 29 10:33:14 crc kubenswrapper[4891]: I0929 10:33:14.686691 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b88cm"] Sep 29 10:33:14 crc kubenswrapper[4891]: I0929 10:33:14.707399 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b88cm"] Sep 29 10:33:14 crc kubenswrapper[4891]: I0929 10:33:14.713464 4891 scope.go:117] "RemoveContainer" containerID="3ad0394a296c5c508a882aada9817c4ca731cab7229697c33e7c747174990338" Sep 29 10:33:14 crc kubenswrapper[4891]: I0929 10:33:14.742909 4891 scope.go:117] "RemoveContainer" containerID="9a2ab9d183376251e6a2c20605f2c10282a6af4788d28db6fcac9254326c44d5" Sep 29 10:33:14 crc kubenswrapper[4891]: E0929 10:33:14.743243 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a2ab9d183376251e6a2c20605f2c10282a6af4788d28db6fcac9254326c44d5\": container with ID starting with 9a2ab9d183376251e6a2c20605f2c10282a6af4788d28db6fcac9254326c44d5 not found: ID does not exist" containerID="9a2ab9d183376251e6a2c20605f2c10282a6af4788d28db6fcac9254326c44d5" Sep 29 10:33:14 crc kubenswrapper[4891]: I0929 10:33:14.743276 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a2ab9d183376251e6a2c20605f2c10282a6af4788d28db6fcac9254326c44d5"} err="failed to get container status \"9a2ab9d183376251e6a2c20605f2c10282a6af4788d28db6fcac9254326c44d5\": rpc error: code = NotFound desc = could not find container \"9a2ab9d183376251e6a2c20605f2c10282a6af4788d28db6fcac9254326c44d5\": container with ID starting with 
9a2ab9d183376251e6a2c20605f2c10282a6af4788d28db6fcac9254326c44d5 not found: ID does not exist" Sep 29 10:33:14 crc kubenswrapper[4891]: I0929 10:33:14.743297 4891 scope.go:117] "RemoveContainer" containerID="ea3fc7723f81487bc8d30e7c51377980ee550306c43d862f0a8b7f570a115c97" Sep 29 10:33:14 crc kubenswrapper[4891]: E0929 10:33:14.743566 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea3fc7723f81487bc8d30e7c51377980ee550306c43d862f0a8b7f570a115c97\": container with ID starting with ea3fc7723f81487bc8d30e7c51377980ee550306c43d862f0a8b7f570a115c97 not found: ID does not exist" containerID="ea3fc7723f81487bc8d30e7c51377980ee550306c43d862f0a8b7f570a115c97" Sep 29 10:33:14 crc kubenswrapper[4891]: I0929 10:33:14.743613 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea3fc7723f81487bc8d30e7c51377980ee550306c43d862f0a8b7f570a115c97"} err="failed to get container status \"ea3fc7723f81487bc8d30e7c51377980ee550306c43d862f0a8b7f570a115c97\": rpc error: code = NotFound desc = could not find container \"ea3fc7723f81487bc8d30e7c51377980ee550306c43d862f0a8b7f570a115c97\": container with ID starting with ea3fc7723f81487bc8d30e7c51377980ee550306c43d862f0a8b7f570a115c97 not found: ID does not exist" Sep 29 10:33:14 crc kubenswrapper[4891]: I0929 10:33:14.743667 4891 scope.go:117] "RemoveContainer" containerID="3ad0394a296c5c508a882aada9817c4ca731cab7229697c33e7c747174990338" Sep 29 10:33:14 crc kubenswrapper[4891]: E0929 10:33:14.743955 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ad0394a296c5c508a882aada9817c4ca731cab7229697c33e7c747174990338\": container with ID starting with 3ad0394a296c5c508a882aada9817c4ca731cab7229697c33e7c747174990338 not found: ID does not exist" containerID="3ad0394a296c5c508a882aada9817c4ca731cab7229697c33e7c747174990338" Sep 29 10:33:14 crc 
kubenswrapper[4891]: I0929 10:33:14.743984 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ad0394a296c5c508a882aada9817c4ca731cab7229697c33e7c747174990338"} err="failed to get container status \"3ad0394a296c5c508a882aada9817c4ca731cab7229697c33e7c747174990338\": rpc error: code = NotFound desc = could not find container \"3ad0394a296c5c508a882aada9817c4ca731cab7229697c33e7c747174990338\": container with ID starting with 3ad0394a296c5c508a882aada9817c4ca731cab7229697c33e7c747174990338 not found: ID does not exist" Sep 29 10:33:16 crc kubenswrapper[4891]: I0929 10:33:16.405370 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e682f4bf-acde-4985-b7d0-86f6c87b4142" path="/var/lib/kubelet/pods/e682f4bf-acde-4985-b7d0-86f6c87b4142/volumes" Sep 29 10:33:36 crc kubenswrapper[4891]: I0929 10:33:36.186063 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:33:36 crc kubenswrapper[4891]: I0929 10:33:36.186689 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.433731 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Sep 29 10:33:38 crc kubenswrapper[4891]: E0929 10:33:38.434506 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e682f4bf-acde-4985-b7d0-86f6c87b4142" containerName="registry-server" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.434551 4891 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e682f4bf-acde-4985-b7d0-86f6c87b4142" containerName="registry-server" Sep 29 10:33:38 crc kubenswrapper[4891]: E0929 10:33:38.434566 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e682f4bf-acde-4985-b7d0-86f6c87b4142" containerName="extract-utilities" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.434572 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="e682f4bf-acde-4985-b7d0-86f6c87b4142" containerName="extract-utilities" Sep 29 10:33:38 crc kubenswrapper[4891]: E0929 10:33:38.434597 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e682f4bf-acde-4985-b7d0-86f6c87b4142" containerName="extract-content" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.434603 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="e682f4bf-acde-4985-b7d0-86f6c87b4142" containerName="extract-content" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.434779 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="e682f4bf-acde-4985-b7d0-86f6c87b4142" containerName="registry-server" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.435551 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.438095 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.438324 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.438461 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-kc85q" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.438568 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.451146 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.631931 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae98e843-bdec-443e-8389-9a58c187f5bd-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.631986 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae98e843-bdec-443e-8389-9a58c187f5bd-config-data\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.632023 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/ae98e843-bdec-443e-8389-9a58c187f5bd-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.632064 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ae98e843-bdec-443e-8389-9a58c187f5bd-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.632360 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ae98e843-bdec-443e-8389-9a58c187f5bd-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.632450 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ae98e843-bdec-443e-8389-9a58c187f5bd-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.632569 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.632726 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/ae98e843-bdec-443e-8389-9a58c187f5bd-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.632781 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpwjn\" (UniqueName: \"kubernetes.io/projected/ae98e843-bdec-443e-8389-9a58c187f5bd-kube-api-access-wpwjn\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.734923 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ae98e843-bdec-443e-8389-9a58c187f5bd-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.735009 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ae98e843-bdec-443e-8389-9a58c187f5bd-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.735107 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ae98e843-bdec-443e-8389-9a58c187f5bd-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.735127 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/ae98e843-bdec-443e-8389-9a58c187f5bd-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.735152 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.735196 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ae98e843-bdec-443e-8389-9a58c187f5bd-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.735223 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpwjn\" (UniqueName: \"kubernetes.io/projected/ae98e843-bdec-443e-8389-9a58c187f5bd-kube-api-access-wpwjn\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.735256 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae98e843-bdec-443e-8389-9a58c187f5bd-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.735271 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae98e843-bdec-443e-8389-9a58c187f5bd-config-data\") pod \"tempest-tests-tempest\" (UID: 
\"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.735664 4891 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.736488 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ae98e843-bdec-443e-8389-9a58c187f5bd-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.737259 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ae98e843-bdec-443e-8389-9a58c187f5bd-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.737871 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ae98e843-bdec-443e-8389-9a58c187f5bd-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.738408 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae98e843-bdec-443e-8389-9a58c187f5bd-config-data\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " 
pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.742619 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ae98e843-bdec-443e-8389-9a58c187f5bd-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.754926 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae98e843-bdec-443e-8389-9a58c187f5bd-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.755036 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ae98e843-bdec-443e-8389-9a58c187f5bd-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.757971 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpwjn\" (UniqueName: \"kubernetes.io/projected/ae98e843-bdec-443e-8389-9a58c187f5bd-kube-api-access-wpwjn\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:38 crc kubenswrapper[4891]: I0929 10:33:38.766470 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " pod="openstack/tempest-tests-tempest" Sep 29 10:33:39 crc kubenswrapper[4891]: I0929 10:33:39.058296 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 29 10:33:39 crc kubenswrapper[4891]: I0929 10:33:39.468743 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Sep 29 10:33:39 crc kubenswrapper[4891]: W0929 10:33:39.471620 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae98e843_bdec_443e_8389_9a58c187f5bd.slice/crio-ceb40e02e55f099f9de3a7277cd0856ab814ee0f47fb765bee8b0aae2bc1d518 WatchSource:0}: Error finding container ceb40e02e55f099f9de3a7277cd0856ab814ee0f47fb765bee8b0aae2bc1d518: Status 404 returned error can't find the container with id ceb40e02e55f099f9de3a7277cd0856ab814ee0f47fb765bee8b0aae2bc1d518 Sep 29 10:33:39 crc kubenswrapper[4891]: I0929 10:33:39.858477 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ae98e843-bdec-443e-8389-9a58c187f5bd","Type":"ContainerStarted","Data":"ceb40e02e55f099f9de3a7277cd0856ab814ee0f47fb765bee8b0aae2bc1d518"} Sep 29 10:33:46 crc kubenswrapper[4891]: I0929 10:33:46.105712 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-msxdd"] Sep 29 10:33:46 crc kubenswrapper[4891]: I0929 10:33:46.111460 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-msxdd" Sep 29 10:33:46 crc kubenswrapper[4891]: I0929 10:33:46.114782 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-msxdd"] Sep 29 10:33:46 crc kubenswrapper[4891]: I0929 10:33:46.209422 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbs8f\" (UniqueName: \"kubernetes.io/projected/1f7810d7-5a3c-44e5-b3e0-dac099d8abf7-kube-api-access-sbs8f\") pod \"community-operators-msxdd\" (UID: \"1f7810d7-5a3c-44e5-b3e0-dac099d8abf7\") " pod="openshift-marketplace/community-operators-msxdd" Sep 29 10:33:46 crc kubenswrapper[4891]: I0929 10:33:46.209856 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7810d7-5a3c-44e5-b3e0-dac099d8abf7-utilities\") pod \"community-operators-msxdd\" (UID: \"1f7810d7-5a3c-44e5-b3e0-dac099d8abf7\") " pod="openshift-marketplace/community-operators-msxdd" Sep 29 10:33:46 crc kubenswrapper[4891]: I0929 10:33:46.210051 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7810d7-5a3c-44e5-b3e0-dac099d8abf7-catalog-content\") pod \"community-operators-msxdd\" (UID: \"1f7810d7-5a3c-44e5-b3e0-dac099d8abf7\") " pod="openshift-marketplace/community-operators-msxdd" Sep 29 10:33:46 crc kubenswrapper[4891]: I0929 10:33:46.312381 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbs8f\" (UniqueName: \"kubernetes.io/projected/1f7810d7-5a3c-44e5-b3e0-dac099d8abf7-kube-api-access-sbs8f\") pod \"community-operators-msxdd\" (UID: \"1f7810d7-5a3c-44e5-b3e0-dac099d8abf7\") " pod="openshift-marketplace/community-operators-msxdd" Sep 29 10:33:46 crc kubenswrapper[4891]: I0929 10:33:46.312468 4891 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7810d7-5a3c-44e5-b3e0-dac099d8abf7-utilities\") pod \"community-operators-msxdd\" (UID: \"1f7810d7-5a3c-44e5-b3e0-dac099d8abf7\") " pod="openshift-marketplace/community-operators-msxdd" Sep 29 10:33:46 crc kubenswrapper[4891]: I0929 10:33:46.312526 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7810d7-5a3c-44e5-b3e0-dac099d8abf7-catalog-content\") pod \"community-operators-msxdd\" (UID: \"1f7810d7-5a3c-44e5-b3e0-dac099d8abf7\") " pod="openshift-marketplace/community-operators-msxdd" Sep 29 10:33:46 crc kubenswrapper[4891]: I0929 10:33:46.313130 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7810d7-5a3c-44e5-b3e0-dac099d8abf7-catalog-content\") pod \"community-operators-msxdd\" (UID: \"1f7810d7-5a3c-44e5-b3e0-dac099d8abf7\") " pod="openshift-marketplace/community-operators-msxdd" Sep 29 10:33:46 crc kubenswrapper[4891]: I0929 10:33:46.313223 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7810d7-5a3c-44e5-b3e0-dac099d8abf7-utilities\") pod \"community-operators-msxdd\" (UID: \"1f7810d7-5a3c-44e5-b3e0-dac099d8abf7\") " pod="openshift-marketplace/community-operators-msxdd" Sep 29 10:33:46 crc kubenswrapper[4891]: I0929 10:33:46.335390 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbs8f\" (UniqueName: \"kubernetes.io/projected/1f7810d7-5a3c-44e5-b3e0-dac099d8abf7-kube-api-access-sbs8f\") pod \"community-operators-msxdd\" (UID: \"1f7810d7-5a3c-44e5-b3e0-dac099d8abf7\") " pod="openshift-marketplace/community-operators-msxdd" Sep 29 10:33:46 crc kubenswrapper[4891]: I0929 10:33:46.439233 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-msxdd" Sep 29 10:33:46 crc kubenswrapper[4891]: I0929 10:33:46.993229 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-msxdd"] Sep 29 10:33:46 crc kubenswrapper[4891]: W0929 10:33:46.996726 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7810d7_5a3c_44e5_b3e0_dac099d8abf7.slice/crio-9f573c119a40f7f9b23a5a789510f8d4761d11a92412524b8ed683386a65d0ee WatchSource:0}: Error finding container 9f573c119a40f7f9b23a5a789510f8d4761d11a92412524b8ed683386a65d0ee: Status 404 returned error can't find the container with id 9f573c119a40f7f9b23a5a789510f8d4761d11a92412524b8ed683386a65d0ee Sep 29 10:33:47 crc kubenswrapper[4891]: I0929 10:33:47.951239 4891 generic.go:334] "Generic (PLEG): container finished" podID="1f7810d7-5a3c-44e5-b3e0-dac099d8abf7" containerID="ba04f0e257ef0e347c2fd098c49aa477124eb18870f23644548f649bb361168e" exitCode=0 Sep 29 10:33:47 crc kubenswrapper[4891]: I0929 10:33:47.951347 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-msxdd" event={"ID":"1f7810d7-5a3c-44e5-b3e0-dac099d8abf7","Type":"ContainerDied","Data":"ba04f0e257ef0e347c2fd098c49aa477124eb18870f23644548f649bb361168e"} Sep 29 10:33:47 crc kubenswrapper[4891]: I0929 10:33:47.951759 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-msxdd" event={"ID":"1f7810d7-5a3c-44e5-b3e0-dac099d8abf7","Type":"ContainerStarted","Data":"9f573c119a40f7f9b23a5a789510f8d4761d11a92412524b8ed683386a65d0ee"} Sep 29 10:33:49 crc kubenswrapper[4891]: I0929 10:33:49.976661 4891 generic.go:334] "Generic (PLEG): container finished" podID="1f7810d7-5a3c-44e5-b3e0-dac099d8abf7" containerID="b80f204598633fc413b3ef4fbbcc931b74d7374a901de097e55235dd8381dfe1" exitCode=0 Sep 29 10:33:49 crc kubenswrapper[4891]: I0929 
10:33:49.977239 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-msxdd" event={"ID":"1f7810d7-5a3c-44e5-b3e0-dac099d8abf7","Type":"ContainerDied","Data":"b80f204598633fc413b3ef4fbbcc931b74d7374a901de097e55235dd8381dfe1"} Sep 29 10:34:01 crc kubenswrapper[4891]: I0929 10:34:01.862021 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-7df6f5468c-2kcvk" podUID="82a9d505-81c4-410a-9707-adb83f47f425" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Sep 29 10:34:05 crc kubenswrapper[4891]: E0929 10:34:05.674299 4891 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Sep 29 10:34:05 crc kubenswrapper[4891]: E0929 10:34:05.674881 4891 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.
yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wpwjn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
tempest-tests-tempest_openstack(ae98e843-bdec-443e-8389-9a58c187f5bd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:34:05 crc kubenswrapper[4891]: E0929 10:34:05.676807 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="ae98e843-bdec-443e-8389-9a58c187f5bd" Sep 29 10:34:06 crc kubenswrapper[4891]: E0929 10:34:06.132797 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="ae98e843-bdec-443e-8389-9a58c187f5bd" Sep 29 10:34:06 crc kubenswrapper[4891]: I0929 10:34:06.186288 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:34:06 crc kubenswrapper[4891]: I0929 10:34:06.186339 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:34:07 crc kubenswrapper[4891]: I0929 10:34:07.141988 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-msxdd" 
event={"ID":"1f7810d7-5a3c-44e5-b3e0-dac099d8abf7","Type":"ContainerStarted","Data":"306db33407c9ef574c2be993dbe63ce3bf8602d05e35b62adab54b2e2917c805"} Sep 29 10:34:07 crc kubenswrapper[4891]: I0929 10:34:07.168977 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-msxdd" podStartSLOduration=3.143106156 podStartE2EDuration="21.168954586s" podCreationTimestamp="2025-09-29 10:33:46 +0000 UTC" firstStartedPulling="2025-09-29 10:33:47.953449954 +0000 UTC m=+2758.158618275" lastFinishedPulling="2025-09-29 10:34:05.979298364 +0000 UTC m=+2776.184466705" observedRunningTime="2025-09-29 10:34:07.160998683 +0000 UTC m=+2777.366167024" watchObservedRunningTime="2025-09-29 10:34:07.168954586 +0000 UTC m=+2777.374122927" Sep 29 10:34:16 crc kubenswrapper[4891]: I0929 10:34:16.439463 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-msxdd" Sep 29 10:34:16 crc kubenswrapper[4891]: I0929 10:34:16.440208 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-msxdd" Sep 29 10:34:16 crc kubenswrapper[4891]: I0929 10:34:16.486072 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-msxdd" Sep 29 10:34:17 crc kubenswrapper[4891]: I0929 10:34:17.283568 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-msxdd" Sep 29 10:34:17 crc kubenswrapper[4891]: I0929 10:34:17.336625 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-msxdd"] Sep 29 10:34:19 crc kubenswrapper[4891]: I0929 10:34:19.246203 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-msxdd" podUID="1f7810d7-5a3c-44e5-b3e0-dac099d8abf7" containerName="registry-server" 
containerID="cri-o://306db33407c9ef574c2be993dbe63ce3bf8602d05e35b62adab54b2e2917c805" gracePeriod=2 Sep 29 10:34:19 crc kubenswrapper[4891]: I0929 10:34:19.726037 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-msxdd" Sep 29 10:34:19 crc kubenswrapper[4891]: I0929 10:34:19.804418 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7810d7-5a3c-44e5-b3e0-dac099d8abf7-utilities\") pod \"1f7810d7-5a3c-44e5-b3e0-dac099d8abf7\" (UID: \"1f7810d7-5a3c-44e5-b3e0-dac099d8abf7\") " Sep 29 10:34:19 crc kubenswrapper[4891]: I0929 10:34:19.804577 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbs8f\" (UniqueName: \"kubernetes.io/projected/1f7810d7-5a3c-44e5-b3e0-dac099d8abf7-kube-api-access-sbs8f\") pod \"1f7810d7-5a3c-44e5-b3e0-dac099d8abf7\" (UID: \"1f7810d7-5a3c-44e5-b3e0-dac099d8abf7\") " Sep 29 10:34:19 crc kubenswrapper[4891]: I0929 10:34:19.804868 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7810d7-5a3c-44e5-b3e0-dac099d8abf7-catalog-content\") pod \"1f7810d7-5a3c-44e5-b3e0-dac099d8abf7\" (UID: \"1f7810d7-5a3c-44e5-b3e0-dac099d8abf7\") " Sep 29 10:34:19 crc kubenswrapper[4891]: I0929 10:34:19.805127 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f7810d7-5a3c-44e5-b3e0-dac099d8abf7-utilities" (OuterVolumeSpecName: "utilities") pod "1f7810d7-5a3c-44e5-b3e0-dac099d8abf7" (UID: "1f7810d7-5a3c-44e5-b3e0-dac099d8abf7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:34:19 crc kubenswrapper[4891]: I0929 10:34:19.805434 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7810d7-5a3c-44e5-b3e0-dac099d8abf7-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:34:19 crc kubenswrapper[4891]: I0929 10:34:19.810940 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f7810d7-5a3c-44e5-b3e0-dac099d8abf7-kube-api-access-sbs8f" (OuterVolumeSpecName: "kube-api-access-sbs8f") pod "1f7810d7-5a3c-44e5-b3e0-dac099d8abf7" (UID: "1f7810d7-5a3c-44e5-b3e0-dac099d8abf7"). InnerVolumeSpecName "kube-api-access-sbs8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:34:19 crc kubenswrapper[4891]: I0929 10:34:19.870186 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f7810d7-5a3c-44e5-b3e0-dac099d8abf7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f7810d7-5a3c-44e5-b3e0-dac099d8abf7" (UID: "1f7810d7-5a3c-44e5-b3e0-dac099d8abf7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:34:19 crc kubenswrapper[4891]: I0929 10:34:19.907870 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7810d7-5a3c-44e5-b3e0-dac099d8abf7-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:34:19 crc kubenswrapper[4891]: I0929 10:34:19.907904 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbs8f\" (UniqueName: \"kubernetes.io/projected/1f7810d7-5a3c-44e5-b3e0-dac099d8abf7-kube-api-access-sbs8f\") on node \"crc\" DevicePath \"\"" Sep 29 10:34:19 crc kubenswrapper[4891]: I0929 10:34:19.920714 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Sep 29 10:34:20 crc kubenswrapper[4891]: I0929 10:34:20.259967 4891 generic.go:334] "Generic (PLEG): container finished" podID="1f7810d7-5a3c-44e5-b3e0-dac099d8abf7" containerID="306db33407c9ef574c2be993dbe63ce3bf8602d05e35b62adab54b2e2917c805" exitCode=0 Sep 29 10:34:20 crc kubenswrapper[4891]: I0929 10:34:20.260009 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-msxdd" event={"ID":"1f7810d7-5a3c-44e5-b3e0-dac099d8abf7","Type":"ContainerDied","Data":"306db33407c9ef574c2be993dbe63ce3bf8602d05e35b62adab54b2e2917c805"} Sep 29 10:34:20 crc kubenswrapper[4891]: I0929 10:34:20.260035 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-msxdd" event={"ID":"1f7810d7-5a3c-44e5-b3e0-dac099d8abf7","Type":"ContainerDied","Data":"9f573c119a40f7f9b23a5a789510f8d4761d11a92412524b8ed683386a65d0ee"} Sep 29 10:34:20 crc kubenswrapper[4891]: I0929 10:34:20.260052 4891 scope.go:117] "RemoveContainer" containerID="306db33407c9ef574c2be993dbe63ce3bf8602d05e35b62adab54b2e2917c805" Sep 29 10:34:20 crc kubenswrapper[4891]: I0929 10:34:20.260061 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-msxdd" Sep 29 10:34:20 crc kubenswrapper[4891]: I0929 10:34:20.282721 4891 scope.go:117] "RemoveContainer" containerID="b80f204598633fc413b3ef4fbbcc931b74d7374a901de097e55235dd8381dfe1" Sep 29 10:34:20 crc kubenswrapper[4891]: I0929 10:34:20.294728 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-msxdd"] Sep 29 10:34:20 crc kubenswrapper[4891]: I0929 10:34:20.301843 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-msxdd"] Sep 29 10:34:20 crc kubenswrapper[4891]: I0929 10:34:20.322110 4891 scope.go:117] "RemoveContainer" containerID="ba04f0e257ef0e347c2fd098c49aa477124eb18870f23644548f649bb361168e" Sep 29 10:34:20 crc kubenswrapper[4891]: I0929 10:34:20.342964 4891 scope.go:117] "RemoveContainer" containerID="306db33407c9ef574c2be993dbe63ce3bf8602d05e35b62adab54b2e2917c805" Sep 29 10:34:20 crc kubenswrapper[4891]: E0929 10:34:20.349313 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"306db33407c9ef574c2be993dbe63ce3bf8602d05e35b62adab54b2e2917c805\": container with ID starting with 306db33407c9ef574c2be993dbe63ce3bf8602d05e35b62adab54b2e2917c805 not found: ID does not exist" containerID="306db33407c9ef574c2be993dbe63ce3bf8602d05e35b62adab54b2e2917c805" Sep 29 10:34:20 crc kubenswrapper[4891]: I0929 10:34:20.349388 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"306db33407c9ef574c2be993dbe63ce3bf8602d05e35b62adab54b2e2917c805"} err="failed to get container status \"306db33407c9ef574c2be993dbe63ce3bf8602d05e35b62adab54b2e2917c805\": rpc error: code = NotFound desc = could not find container \"306db33407c9ef574c2be993dbe63ce3bf8602d05e35b62adab54b2e2917c805\": container with ID starting with 306db33407c9ef574c2be993dbe63ce3bf8602d05e35b62adab54b2e2917c805 not 
found: ID does not exist" Sep 29 10:34:20 crc kubenswrapper[4891]: I0929 10:34:20.349452 4891 scope.go:117] "RemoveContainer" containerID="b80f204598633fc413b3ef4fbbcc931b74d7374a901de097e55235dd8381dfe1" Sep 29 10:34:20 crc kubenswrapper[4891]: E0929 10:34:20.350242 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b80f204598633fc413b3ef4fbbcc931b74d7374a901de097e55235dd8381dfe1\": container with ID starting with b80f204598633fc413b3ef4fbbcc931b74d7374a901de097e55235dd8381dfe1 not found: ID does not exist" containerID="b80f204598633fc413b3ef4fbbcc931b74d7374a901de097e55235dd8381dfe1" Sep 29 10:34:20 crc kubenswrapper[4891]: I0929 10:34:20.350283 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80f204598633fc413b3ef4fbbcc931b74d7374a901de097e55235dd8381dfe1"} err="failed to get container status \"b80f204598633fc413b3ef4fbbcc931b74d7374a901de097e55235dd8381dfe1\": rpc error: code = NotFound desc = could not find container \"b80f204598633fc413b3ef4fbbcc931b74d7374a901de097e55235dd8381dfe1\": container with ID starting with b80f204598633fc413b3ef4fbbcc931b74d7374a901de097e55235dd8381dfe1 not found: ID does not exist" Sep 29 10:34:20 crc kubenswrapper[4891]: I0929 10:34:20.350312 4891 scope.go:117] "RemoveContainer" containerID="ba04f0e257ef0e347c2fd098c49aa477124eb18870f23644548f649bb361168e" Sep 29 10:34:20 crc kubenswrapper[4891]: E0929 10:34:20.350627 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba04f0e257ef0e347c2fd098c49aa477124eb18870f23644548f649bb361168e\": container with ID starting with ba04f0e257ef0e347c2fd098c49aa477124eb18870f23644548f649bb361168e not found: ID does not exist" containerID="ba04f0e257ef0e347c2fd098c49aa477124eb18870f23644548f649bb361168e" Sep 29 10:34:20 crc kubenswrapper[4891]: I0929 10:34:20.350666 4891 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba04f0e257ef0e347c2fd098c49aa477124eb18870f23644548f649bb361168e"} err="failed to get container status \"ba04f0e257ef0e347c2fd098c49aa477124eb18870f23644548f649bb361168e\": rpc error: code = NotFound desc = could not find container \"ba04f0e257ef0e347c2fd098c49aa477124eb18870f23644548f649bb361168e\": container with ID starting with ba04f0e257ef0e347c2fd098c49aa477124eb18870f23644548f649bb361168e not found: ID does not exist" Sep 29 10:34:20 crc kubenswrapper[4891]: I0929 10:34:20.410709 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f7810d7-5a3c-44e5-b3e0-dac099d8abf7" path="/var/lib/kubelet/pods/1f7810d7-5a3c-44e5-b3e0-dac099d8abf7/volumes" Sep 29 10:34:21 crc kubenswrapper[4891]: I0929 10:34:21.272948 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ae98e843-bdec-443e-8389-9a58c187f5bd","Type":"ContainerStarted","Data":"1834a36e77aead21ea6fb7d3414bad933a20ea71cabd0b1fe78363461e0fbcb3"} Sep 29 10:34:21 crc kubenswrapper[4891]: I0929 10:34:21.295139 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.850443886 podStartE2EDuration="44.295121578s" podCreationTimestamp="2025-09-29 10:33:37 +0000 UTC" firstStartedPulling="2025-09-29 10:33:39.473370294 +0000 UTC m=+2749.678538605" lastFinishedPulling="2025-09-29 10:34:19.918047966 +0000 UTC m=+2790.123216297" observedRunningTime="2025-09-29 10:34:21.288836562 +0000 UTC m=+2791.494004903" watchObservedRunningTime="2025-09-29 10:34:21.295121578 +0000 UTC m=+2791.500289899" Sep 29 10:34:36 crc kubenswrapper[4891]: I0929 10:34:36.186284 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:34:36 crc kubenswrapper[4891]: I0929 10:34:36.186937 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:34:36 crc kubenswrapper[4891]: I0929 10:34:36.186990 4891 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 10:34:36 crc kubenswrapper[4891]: I0929 10:34:36.187753 4891 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3d8eb4b308d3b32817d1b6f98b14da560b979ce2f1778af16583ebb0639299b6"} pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 10:34:36 crc kubenswrapper[4891]: I0929 10:34:36.187829 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" containerID="cri-o://3d8eb4b308d3b32817d1b6f98b14da560b979ce2f1778af16583ebb0639299b6" gracePeriod=600 Sep 29 10:34:36 crc kubenswrapper[4891]: I0929 10:34:36.424977 4891 generic.go:334] "Generic (PLEG): container finished" podID="582de198-5a15-4c4c-aaea-881c638a42ac" containerID="3d8eb4b308d3b32817d1b6f98b14da560b979ce2f1778af16583ebb0639299b6" exitCode=0 Sep 29 10:34:36 crc kubenswrapper[4891]: I0929 10:34:36.425028 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" 
event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerDied","Data":"3d8eb4b308d3b32817d1b6f98b14da560b979ce2f1778af16583ebb0639299b6"} Sep 29 10:34:36 crc kubenswrapper[4891]: I0929 10:34:36.425068 4891 scope.go:117] "RemoveContainer" containerID="8e0e3386b28f9d42d75e45b7cc832e07a72736ad37a4ed31470e9ca5b3840ece" Sep 29 10:34:37 crc kubenswrapper[4891]: I0929 10:34:37.440140 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerStarted","Data":"807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329"} Sep 29 10:36:36 crc kubenswrapper[4891]: I0929 10:36:36.186162 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:36:36 crc kubenswrapper[4891]: I0929 10:36:36.186725 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:37:06 crc kubenswrapper[4891]: I0929 10:37:06.186049 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:37:06 crc kubenswrapper[4891]: I0929 10:37:06.186900 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:37:36 crc kubenswrapper[4891]: I0929 10:37:36.186135 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:37:36 crc kubenswrapper[4891]: I0929 10:37:36.186690 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:37:36 crc kubenswrapper[4891]: I0929 10:37:36.186736 4891 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 10:37:36 crc kubenswrapper[4891]: I0929 10:37:36.187316 4891 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329"} pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 10:37:36 crc kubenswrapper[4891]: I0929 10:37:36.187381 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" containerID="cri-o://807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" gracePeriod=600 Sep 29 10:37:36 crc kubenswrapper[4891]: E0929 
10:37:36.315570 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:37:37 crc kubenswrapper[4891]: I0929 10:37:37.089473 4891 generic.go:334] "Generic (PLEG): container finished" podID="582de198-5a15-4c4c-aaea-881c638a42ac" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" exitCode=0 Sep 29 10:37:37 crc kubenswrapper[4891]: I0929 10:37:37.089537 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerDied","Data":"807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329"} Sep 29 10:37:37 crc kubenswrapper[4891]: I0929 10:37:37.089586 4891 scope.go:117] "RemoveContainer" containerID="3d8eb4b308d3b32817d1b6f98b14da560b979ce2f1778af16583ebb0639299b6" Sep 29 10:37:37 crc kubenswrapper[4891]: I0929 10:37:37.090580 4891 scope.go:117] "RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:37:37 crc kubenswrapper[4891]: E0929 10:37:37.091139 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:37:48 crc kubenswrapper[4891]: I0929 10:37:48.395924 4891 scope.go:117] "RemoveContainer" 
containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:37:48 crc kubenswrapper[4891]: E0929 10:37:48.396774 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:38:01 crc kubenswrapper[4891]: I0929 10:38:01.396089 4891 scope.go:117] "RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:38:01 crc kubenswrapper[4891]: E0929 10:38:01.396915 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:38:15 crc kubenswrapper[4891]: I0929 10:38:15.396080 4891 scope.go:117] "RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:38:15 crc kubenswrapper[4891]: E0929 10:38:15.398053 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:38:30 crc kubenswrapper[4891]: I0929 10:38:30.405159 4891 scope.go:117] 
"RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:38:30 crc kubenswrapper[4891]: E0929 10:38:30.406178 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:38:44 crc kubenswrapper[4891]: I0929 10:38:44.395944 4891 scope.go:117] "RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:38:44 crc kubenswrapper[4891]: E0929 10:38:44.397092 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:38:56 crc kubenswrapper[4891]: I0929 10:38:56.396130 4891 scope.go:117] "RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:38:56 crc kubenswrapper[4891]: E0929 10:38:56.396845 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:39:07 crc kubenswrapper[4891]: I0929 10:39:07.398108 
4891 scope.go:117] "RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:39:07 crc kubenswrapper[4891]: E0929 10:39:07.399159 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:39:19 crc kubenswrapper[4891]: I0929 10:39:19.395603 4891 scope.go:117] "RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:39:19 crc kubenswrapper[4891]: E0929 10:39:19.396487 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:39:34 crc kubenswrapper[4891]: I0929 10:39:34.396411 4891 scope.go:117] "RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:39:34 crc kubenswrapper[4891]: E0929 10:39:34.415327 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:39:46 crc kubenswrapper[4891]: I0929 
10:39:46.395889 4891 scope.go:117] "RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:39:46 crc kubenswrapper[4891]: E0929 10:39:46.396694 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:40:01 crc kubenswrapper[4891]: I0929 10:40:01.396185 4891 scope.go:117] "RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:40:01 crc kubenswrapper[4891]: E0929 10:40:01.397514 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:40:14 crc kubenswrapper[4891]: I0929 10:40:14.396892 4891 scope.go:117] "RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:40:14 crc kubenswrapper[4891]: E0929 10:40:14.397729 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:40:25 crc 
kubenswrapper[4891]: I0929 10:40:25.396045 4891 scope.go:117] "RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:40:25 crc kubenswrapper[4891]: E0929 10:40:25.396845 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:40:40 crc kubenswrapper[4891]: I0929 10:40:40.403257 4891 scope.go:117] "RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:40:40 crc kubenswrapper[4891]: E0929 10:40:40.404084 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:40:45 crc kubenswrapper[4891]: I0929 10:40:45.521306 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cb75z"] Sep 29 10:40:45 crc kubenswrapper[4891]: E0929 10:40:45.522295 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7810d7-5a3c-44e5-b3e0-dac099d8abf7" containerName="extract-utilities" Sep 29 10:40:45 crc kubenswrapper[4891]: I0929 10:40:45.522311 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7810d7-5a3c-44e5-b3e0-dac099d8abf7" containerName="extract-utilities" Sep 29 10:40:45 crc kubenswrapper[4891]: E0929 10:40:45.522371 4891 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="1f7810d7-5a3c-44e5-b3e0-dac099d8abf7" containerName="extract-content" Sep 29 10:40:45 crc kubenswrapper[4891]: I0929 10:40:45.522378 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7810d7-5a3c-44e5-b3e0-dac099d8abf7" containerName="extract-content" Sep 29 10:40:45 crc kubenswrapper[4891]: E0929 10:40:45.522389 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7810d7-5a3c-44e5-b3e0-dac099d8abf7" containerName="registry-server" Sep 29 10:40:45 crc kubenswrapper[4891]: I0929 10:40:45.522395 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7810d7-5a3c-44e5-b3e0-dac099d8abf7" containerName="registry-server" Sep 29 10:40:45 crc kubenswrapper[4891]: I0929 10:40:45.522640 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f7810d7-5a3c-44e5-b3e0-dac099d8abf7" containerName="registry-server" Sep 29 10:40:45 crc kubenswrapper[4891]: I0929 10:40:45.526941 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cb75z" Sep 29 10:40:45 crc kubenswrapper[4891]: I0929 10:40:45.555590 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrd6w\" (UniqueName: \"kubernetes.io/projected/f39e4067-f11a-4374-a919-2cbb8e82e22c-kube-api-access-wrd6w\") pod \"certified-operators-cb75z\" (UID: \"f39e4067-f11a-4374-a919-2cbb8e82e22c\") " pod="openshift-marketplace/certified-operators-cb75z" Sep 29 10:40:45 crc kubenswrapper[4891]: I0929 10:40:45.555932 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f39e4067-f11a-4374-a919-2cbb8e82e22c-utilities\") pod \"certified-operators-cb75z\" (UID: \"f39e4067-f11a-4374-a919-2cbb8e82e22c\") " pod="openshift-marketplace/certified-operators-cb75z" Sep 29 10:40:45 crc kubenswrapper[4891]: I0929 10:40:45.556090 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f39e4067-f11a-4374-a919-2cbb8e82e22c-catalog-content\") pod \"certified-operators-cb75z\" (UID: \"f39e4067-f11a-4374-a919-2cbb8e82e22c\") " pod="openshift-marketplace/certified-operators-cb75z" Sep 29 10:40:45 crc kubenswrapper[4891]: I0929 10:40:45.559495 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cb75z"] Sep 29 10:40:45 crc kubenswrapper[4891]: I0929 10:40:45.658281 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f39e4067-f11a-4374-a919-2cbb8e82e22c-utilities\") pod \"certified-operators-cb75z\" (UID: \"f39e4067-f11a-4374-a919-2cbb8e82e22c\") " pod="openshift-marketplace/certified-operators-cb75z" Sep 29 10:40:45 crc kubenswrapper[4891]: I0929 10:40:45.658450 4891 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f39e4067-f11a-4374-a919-2cbb8e82e22c-catalog-content\") pod \"certified-operators-cb75z\" (UID: \"f39e4067-f11a-4374-a919-2cbb8e82e22c\") " pod="openshift-marketplace/certified-operators-cb75z" Sep 29 10:40:45 crc kubenswrapper[4891]: I0929 10:40:45.658543 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrd6w\" (UniqueName: \"kubernetes.io/projected/f39e4067-f11a-4374-a919-2cbb8e82e22c-kube-api-access-wrd6w\") pod \"certified-operators-cb75z\" (UID: \"f39e4067-f11a-4374-a919-2cbb8e82e22c\") " pod="openshift-marketplace/certified-operators-cb75z" Sep 29 10:40:45 crc kubenswrapper[4891]: I0929 10:40:45.658930 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f39e4067-f11a-4374-a919-2cbb8e82e22c-utilities\") pod \"certified-operators-cb75z\" (UID: \"f39e4067-f11a-4374-a919-2cbb8e82e22c\") " pod="openshift-marketplace/certified-operators-cb75z" Sep 29 10:40:45 crc kubenswrapper[4891]: I0929 10:40:45.658961 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f39e4067-f11a-4374-a919-2cbb8e82e22c-catalog-content\") pod \"certified-operators-cb75z\" (UID: \"f39e4067-f11a-4374-a919-2cbb8e82e22c\") " pod="openshift-marketplace/certified-operators-cb75z" Sep 29 10:40:45 crc kubenswrapper[4891]: I0929 10:40:45.681745 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrd6w\" (UniqueName: \"kubernetes.io/projected/f39e4067-f11a-4374-a919-2cbb8e82e22c-kube-api-access-wrd6w\") pod \"certified-operators-cb75z\" (UID: \"f39e4067-f11a-4374-a919-2cbb8e82e22c\") " pod="openshift-marketplace/certified-operators-cb75z" Sep 29 10:40:45 crc kubenswrapper[4891]: I0929 10:40:45.855140 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cb75z" Sep 29 10:40:46 crc kubenswrapper[4891]: I0929 10:40:46.334871 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cb75z"] Sep 29 10:40:46 crc kubenswrapper[4891]: I0929 10:40:46.913555 4891 generic.go:334] "Generic (PLEG): container finished" podID="f39e4067-f11a-4374-a919-2cbb8e82e22c" containerID="e0ab44e0a6def6f93339a28983510bc9f878ee25def7ac0487131398dfd13e11" exitCode=0 Sep 29 10:40:46 crc kubenswrapper[4891]: I0929 10:40:46.913654 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cb75z" event={"ID":"f39e4067-f11a-4374-a919-2cbb8e82e22c","Type":"ContainerDied","Data":"e0ab44e0a6def6f93339a28983510bc9f878ee25def7ac0487131398dfd13e11"} Sep 29 10:40:46 crc kubenswrapper[4891]: I0929 10:40:46.913897 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cb75z" event={"ID":"f39e4067-f11a-4374-a919-2cbb8e82e22c","Type":"ContainerStarted","Data":"6a664be9d5dae73bb6da7e8b09b8b28f13dcc851a5486b8dcf54ce1b8bc878e5"} Sep 29 10:40:46 crc kubenswrapper[4891]: I0929 10:40:46.916302 4891 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 10:40:48 crc kubenswrapper[4891]: I0929 10:40:48.934690 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cb75z" event={"ID":"f39e4067-f11a-4374-a919-2cbb8e82e22c","Type":"ContainerStarted","Data":"1a01d9a84be3a75f333ca2f1fbfef2634a3bda988ce047376ecd1bb26e00aa30"} Sep 29 10:40:49 crc kubenswrapper[4891]: I0929 10:40:49.942558 4891 generic.go:334] "Generic (PLEG): container finished" podID="f39e4067-f11a-4374-a919-2cbb8e82e22c" containerID="1a01d9a84be3a75f333ca2f1fbfef2634a3bda988ce047376ecd1bb26e00aa30" exitCode=0 Sep 29 10:40:49 crc kubenswrapper[4891]: I0929 10:40:49.942600 4891 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-cb75z" event={"ID":"f39e4067-f11a-4374-a919-2cbb8e82e22c","Type":"ContainerDied","Data":"1a01d9a84be3a75f333ca2f1fbfef2634a3bda988ce047376ecd1bb26e00aa30"} Sep 29 10:40:50 crc kubenswrapper[4891]: I0929 10:40:50.954532 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cb75z" event={"ID":"f39e4067-f11a-4374-a919-2cbb8e82e22c","Type":"ContainerStarted","Data":"d7fe3c2ec4b85988e662e9be3963afa1dc1c242e8c475ebe4a2e598d314fbca9"} Sep 29 10:40:50 crc kubenswrapper[4891]: I0929 10:40:50.978043 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cb75z" podStartSLOduration=2.29594989 podStartE2EDuration="5.97802577s" podCreationTimestamp="2025-09-29 10:40:45 +0000 UTC" firstStartedPulling="2025-09-29 10:40:46.916054634 +0000 UTC m=+3177.121222945" lastFinishedPulling="2025-09-29 10:40:50.598130504 +0000 UTC m=+3180.803298825" observedRunningTime="2025-09-29 10:40:50.969743971 +0000 UTC m=+3181.174912312" watchObservedRunningTime="2025-09-29 10:40:50.97802577 +0000 UTC m=+3181.183194101" Sep 29 10:40:53 crc kubenswrapper[4891]: I0929 10:40:53.396468 4891 scope.go:117] "RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:40:53 crc kubenswrapper[4891]: E0929 10:40:53.397319 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:40:55 crc kubenswrapper[4891]: I0929 10:40:55.855349 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-cb75z" Sep 29 10:40:55 crc kubenswrapper[4891]: I0929 10:40:55.856090 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cb75z" Sep 29 10:40:55 crc kubenswrapper[4891]: I0929 10:40:55.946257 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cb75z" Sep 29 10:40:56 crc kubenswrapper[4891]: I0929 10:40:56.080665 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cb75z" Sep 29 10:40:56 crc kubenswrapper[4891]: I0929 10:40:56.202376 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cb75z"] Sep 29 10:40:58 crc kubenswrapper[4891]: I0929 10:40:58.031634 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cb75z" podUID="f39e4067-f11a-4374-a919-2cbb8e82e22c" containerName="registry-server" containerID="cri-o://d7fe3c2ec4b85988e662e9be3963afa1dc1c242e8c475ebe4a2e598d314fbca9" gracePeriod=2 Sep 29 10:40:58 crc kubenswrapper[4891]: I0929 10:40:58.540220 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cb75z" Sep 29 10:40:58 crc kubenswrapper[4891]: I0929 10:40:58.694896 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f39e4067-f11a-4374-a919-2cbb8e82e22c-utilities\") pod \"f39e4067-f11a-4374-a919-2cbb8e82e22c\" (UID: \"f39e4067-f11a-4374-a919-2cbb8e82e22c\") " Sep 29 10:40:58 crc kubenswrapper[4891]: I0929 10:40:58.695235 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f39e4067-f11a-4374-a919-2cbb8e82e22c-catalog-content\") pod \"f39e4067-f11a-4374-a919-2cbb8e82e22c\" (UID: \"f39e4067-f11a-4374-a919-2cbb8e82e22c\") " Sep 29 10:40:58 crc kubenswrapper[4891]: I0929 10:40:58.695442 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrd6w\" (UniqueName: \"kubernetes.io/projected/f39e4067-f11a-4374-a919-2cbb8e82e22c-kube-api-access-wrd6w\") pod \"f39e4067-f11a-4374-a919-2cbb8e82e22c\" (UID: \"f39e4067-f11a-4374-a919-2cbb8e82e22c\") " Sep 29 10:40:58 crc kubenswrapper[4891]: I0929 10:40:58.695983 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f39e4067-f11a-4374-a919-2cbb8e82e22c-utilities" (OuterVolumeSpecName: "utilities") pod "f39e4067-f11a-4374-a919-2cbb8e82e22c" (UID: "f39e4067-f11a-4374-a919-2cbb8e82e22c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:40:58 crc kubenswrapper[4891]: I0929 10:40:58.696322 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f39e4067-f11a-4374-a919-2cbb8e82e22c-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:40:58 crc kubenswrapper[4891]: I0929 10:40:58.704739 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f39e4067-f11a-4374-a919-2cbb8e82e22c-kube-api-access-wrd6w" (OuterVolumeSpecName: "kube-api-access-wrd6w") pod "f39e4067-f11a-4374-a919-2cbb8e82e22c" (UID: "f39e4067-f11a-4374-a919-2cbb8e82e22c"). InnerVolumeSpecName "kube-api-access-wrd6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:40:58 crc kubenswrapper[4891]: I0929 10:40:58.765679 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f39e4067-f11a-4374-a919-2cbb8e82e22c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f39e4067-f11a-4374-a919-2cbb8e82e22c" (UID: "f39e4067-f11a-4374-a919-2cbb8e82e22c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:40:58 crc kubenswrapper[4891]: I0929 10:40:58.797883 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f39e4067-f11a-4374-a919-2cbb8e82e22c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:40:58 crc kubenswrapper[4891]: I0929 10:40:58.797911 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrd6w\" (UniqueName: \"kubernetes.io/projected/f39e4067-f11a-4374-a919-2cbb8e82e22c-kube-api-access-wrd6w\") on node \"crc\" DevicePath \"\"" Sep 29 10:40:59 crc kubenswrapper[4891]: I0929 10:40:59.045319 4891 generic.go:334] "Generic (PLEG): container finished" podID="f39e4067-f11a-4374-a919-2cbb8e82e22c" containerID="d7fe3c2ec4b85988e662e9be3963afa1dc1c242e8c475ebe4a2e598d314fbca9" exitCode=0 Sep 29 10:40:59 crc kubenswrapper[4891]: I0929 10:40:59.045390 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cb75z" event={"ID":"f39e4067-f11a-4374-a919-2cbb8e82e22c","Type":"ContainerDied","Data":"d7fe3c2ec4b85988e662e9be3963afa1dc1c242e8c475ebe4a2e598d314fbca9"} Sep 29 10:40:59 crc kubenswrapper[4891]: I0929 10:40:59.045433 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cb75z" Sep 29 10:40:59 crc kubenswrapper[4891]: I0929 10:40:59.045457 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cb75z" event={"ID":"f39e4067-f11a-4374-a919-2cbb8e82e22c","Type":"ContainerDied","Data":"6a664be9d5dae73bb6da7e8b09b8b28f13dcc851a5486b8dcf54ce1b8bc878e5"} Sep 29 10:40:59 crc kubenswrapper[4891]: I0929 10:40:59.045930 4891 scope.go:117] "RemoveContainer" containerID="d7fe3c2ec4b85988e662e9be3963afa1dc1c242e8c475ebe4a2e598d314fbca9" Sep 29 10:40:59 crc kubenswrapper[4891]: I0929 10:40:59.070320 4891 scope.go:117] "RemoveContainer" containerID="1a01d9a84be3a75f333ca2f1fbfef2634a3bda988ce047376ecd1bb26e00aa30" Sep 29 10:40:59 crc kubenswrapper[4891]: I0929 10:40:59.120211 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cb75z"] Sep 29 10:40:59 crc kubenswrapper[4891]: I0929 10:40:59.125825 4891 scope.go:117] "RemoveContainer" containerID="e0ab44e0a6def6f93339a28983510bc9f878ee25def7ac0487131398dfd13e11" Sep 29 10:40:59 crc kubenswrapper[4891]: I0929 10:40:59.130635 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cb75z"] Sep 29 10:40:59 crc kubenswrapper[4891]: I0929 10:40:59.164736 4891 scope.go:117] "RemoveContainer" containerID="d7fe3c2ec4b85988e662e9be3963afa1dc1c242e8c475ebe4a2e598d314fbca9" Sep 29 10:40:59 crc kubenswrapper[4891]: E0929 10:40:59.165310 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7fe3c2ec4b85988e662e9be3963afa1dc1c242e8c475ebe4a2e598d314fbca9\": container with ID starting with d7fe3c2ec4b85988e662e9be3963afa1dc1c242e8c475ebe4a2e598d314fbca9 not found: ID does not exist" containerID="d7fe3c2ec4b85988e662e9be3963afa1dc1c242e8c475ebe4a2e598d314fbca9" Sep 29 10:40:59 crc kubenswrapper[4891]: I0929 10:40:59.165350 4891 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7fe3c2ec4b85988e662e9be3963afa1dc1c242e8c475ebe4a2e598d314fbca9"} err="failed to get container status \"d7fe3c2ec4b85988e662e9be3963afa1dc1c242e8c475ebe4a2e598d314fbca9\": rpc error: code = NotFound desc = could not find container \"d7fe3c2ec4b85988e662e9be3963afa1dc1c242e8c475ebe4a2e598d314fbca9\": container with ID starting with d7fe3c2ec4b85988e662e9be3963afa1dc1c242e8c475ebe4a2e598d314fbca9 not found: ID does not exist" Sep 29 10:40:59 crc kubenswrapper[4891]: I0929 10:40:59.165376 4891 scope.go:117] "RemoveContainer" containerID="1a01d9a84be3a75f333ca2f1fbfef2634a3bda988ce047376ecd1bb26e00aa30" Sep 29 10:40:59 crc kubenswrapper[4891]: E0929 10:40:59.165844 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a01d9a84be3a75f333ca2f1fbfef2634a3bda988ce047376ecd1bb26e00aa30\": container with ID starting with 1a01d9a84be3a75f333ca2f1fbfef2634a3bda988ce047376ecd1bb26e00aa30 not found: ID does not exist" containerID="1a01d9a84be3a75f333ca2f1fbfef2634a3bda988ce047376ecd1bb26e00aa30" Sep 29 10:40:59 crc kubenswrapper[4891]: I0929 10:40:59.165894 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a01d9a84be3a75f333ca2f1fbfef2634a3bda988ce047376ecd1bb26e00aa30"} err="failed to get container status \"1a01d9a84be3a75f333ca2f1fbfef2634a3bda988ce047376ecd1bb26e00aa30\": rpc error: code = NotFound desc = could not find container \"1a01d9a84be3a75f333ca2f1fbfef2634a3bda988ce047376ecd1bb26e00aa30\": container with ID starting with 1a01d9a84be3a75f333ca2f1fbfef2634a3bda988ce047376ecd1bb26e00aa30 not found: ID does not exist" Sep 29 10:40:59 crc kubenswrapper[4891]: I0929 10:40:59.165926 4891 scope.go:117] "RemoveContainer" containerID="e0ab44e0a6def6f93339a28983510bc9f878ee25def7ac0487131398dfd13e11" Sep 29 10:40:59 crc kubenswrapper[4891]: E0929 
10:40:59.166281 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0ab44e0a6def6f93339a28983510bc9f878ee25def7ac0487131398dfd13e11\": container with ID starting with e0ab44e0a6def6f93339a28983510bc9f878ee25def7ac0487131398dfd13e11 not found: ID does not exist" containerID="e0ab44e0a6def6f93339a28983510bc9f878ee25def7ac0487131398dfd13e11" Sep 29 10:40:59 crc kubenswrapper[4891]: I0929 10:40:59.166312 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0ab44e0a6def6f93339a28983510bc9f878ee25def7ac0487131398dfd13e11"} err="failed to get container status \"e0ab44e0a6def6f93339a28983510bc9f878ee25def7ac0487131398dfd13e11\": rpc error: code = NotFound desc = could not find container \"e0ab44e0a6def6f93339a28983510bc9f878ee25def7ac0487131398dfd13e11\": container with ID starting with e0ab44e0a6def6f93339a28983510bc9f878ee25def7ac0487131398dfd13e11 not found: ID does not exist" Sep 29 10:41:00 crc kubenswrapper[4891]: I0929 10:41:00.406692 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f39e4067-f11a-4374-a919-2cbb8e82e22c" path="/var/lib/kubelet/pods/f39e4067-f11a-4374-a919-2cbb8e82e22c/volumes" Sep 29 10:41:05 crc kubenswrapper[4891]: I0929 10:41:05.396118 4891 scope.go:117] "RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:41:05 crc kubenswrapper[4891]: E0929 10:41:05.396855 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:41:20 crc kubenswrapper[4891]: I0929 10:41:20.402944 
4891 scope.go:117] "RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:41:20 crc kubenswrapper[4891]: E0929 10:41:20.404210 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:41:32 crc kubenswrapper[4891]: I0929 10:41:32.396306 4891 scope.go:117] "RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:41:32 crc kubenswrapper[4891]: E0929 10:41:32.397071 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:41:45 crc kubenswrapper[4891]: I0929 10:41:45.396040 4891 scope.go:117] "RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:41:45 crc kubenswrapper[4891]: E0929 10:41:45.396891 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:41:56 crc kubenswrapper[4891]: I0929 
10:41:56.397339 4891 scope.go:117] "RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:41:56 crc kubenswrapper[4891]: E0929 10:41:56.398926 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:42:08 crc kubenswrapper[4891]: I0929 10:42:08.396295 4891 scope.go:117] "RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:42:08 crc kubenswrapper[4891]: E0929 10:42:08.397197 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:42:20 crc kubenswrapper[4891]: I0929 10:42:20.403045 4891 scope.go:117] "RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:42:20 crc kubenswrapper[4891]: E0929 10:42:20.404075 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:42:21 crc 
kubenswrapper[4891]: I0929 10:42:21.213187 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-24lpn"] Sep 29 10:42:21 crc kubenswrapper[4891]: E0929 10:42:21.213920 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f39e4067-f11a-4374-a919-2cbb8e82e22c" containerName="extract-content" Sep 29 10:42:21 crc kubenswrapper[4891]: I0929 10:42:21.213953 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="f39e4067-f11a-4374-a919-2cbb8e82e22c" containerName="extract-content" Sep 29 10:42:21 crc kubenswrapper[4891]: E0929 10:42:21.213996 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f39e4067-f11a-4374-a919-2cbb8e82e22c" containerName="extract-utilities" Sep 29 10:42:21 crc kubenswrapper[4891]: I0929 10:42:21.214009 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="f39e4067-f11a-4374-a919-2cbb8e82e22c" containerName="extract-utilities" Sep 29 10:42:21 crc kubenswrapper[4891]: E0929 10:42:21.214044 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f39e4067-f11a-4374-a919-2cbb8e82e22c" containerName="registry-server" Sep 29 10:42:21 crc kubenswrapper[4891]: I0929 10:42:21.214058 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="f39e4067-f11a-4374-a919-2cbb8e82e22c" containerName="registry-server" Sep 29 10:42:21 crc kubenswrapper[4891]: I0929 10:42:21.214337 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="f39e4067-f11a-4374-a919-2cbb8e82e22c" containerName="registry-server" Sep 29 10:42:21 crc kubenswrapper[4891]: I0929 10:42:21.216115 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-24lpn" Sep 29 10:42:21 crc kubenswrapper[4891]: I0929 10:42:21.224596 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-24lpn"] Sep 29 10:42:21 crc kubenswrapper[4891]: I0929 10:42:21.254041 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/046e01ea-d2ef-4013-9bc2-65a9b5f9caae-catalog-content\") pod \"redhat-operators-24lpn\" (UID: \"046e01ea-d2ef-4013-9bc2-65a9b5f9caae\") " pod="openshift-marketplace/redhat-operators-24lpn" Sep 29 10:42:21 crc kubenswrapper[4891]: I0929 10:42:21.254131 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/046e01ea-d2ef-4013-9bc2-65a9b5f9caae-utilities\") pod \"redhat-operators-24lpn\" (UID: \"046e01ea-d2ef-4013-9bc2-65a9b5f9caae\") " pod="openshift-marketplace/redhat-operators-24lpn" Sep 29 10:42:21 crc kubenswrapper[4891]: I0929 10:42:21.254287 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnwwq\" (UniqueName: \"kubernetes.io/projected/046e01ea-d2ef-4013-9bc2-65a9b5f9caae-kube-api-access-gnwwq\") pod \"redhat-operators-24lpn\" (UID: \"046e01ea-d2ef-4013-9bc2-65a9b5f9caae\") " pod="openshift-marketplace/redhat-operators-24lpn" Sep 29 10:42:21 crc kubenswrapper[4891]: I0929 10:42:21.355859 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/046e01ea-d2ef-4013-9bc2-65a9b5f9caae-catalog-content\") pod \"redhat-operators-24lpn\" (UID: \"046e01ea-d2ef-4013-9bc2-65a9b5f9caae\") " pod="openshift-marketplace/redhat-operators-24lpn" Sep 29 10:42:21 crc kubenswrapper[4891]: I0929 10:42:21.356420 4891 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/046e01ea-d2ef-4013-9bc2-65a9b5f9caae-utilities\") pod \"redhat-operators-24lpn\" (UID: \"046e01ea-d2ef-4013-9bc2-65a9b5f9caae\") " pod="openshift-marketplace/redhat-operators-24lpn" Sep 29 10:42:21 crc kubenswrapper[4891]: I0929 10:42:21.356566 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnwwq\" (UniqueName: \"kubernetes.io/projected/046e01ea-d2ef-4013-9bc2-65a9b5f9caae-kube-api-access-gnwwq\") pod \"redhat-operators-24lpn\" (UID: \"046e01ea-d2ef-4013-9bc2-65a9b5f9caae\") " pod="openshift-marketplace/redhat-operators-24lpn" Sep 29 10:42:21 crc kubenswrapper[4891]: I0929 10:42:21.356612 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/046e01ea-d2ef-4013-9bc2-65a9b5f9caae-catalog-content\") pod \"redhat-operators-24lpn\" (UID: \"046e01ea-d2ef-4013-9bc2-65a9b5f9caae\") " pod="openshift-marketplace/redhat-operators-24lpn" Sep 29 10:42:21 crc kubenswrapper[4891]: I0929 10:42:21.356736 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/046e01ea-d2ef-4013-9bc2-65a9b5f9caae-utilities\") pod \"redhat-operators-24lpn\" (UID: \"046e01ea-d2ef-4013-9bc2-65a9b5f9caae\") " pod="openshift-marketplace/redhat-operators-24lpn" Sep 29 10:42:21 crc kubenswrapper[4891]: I0929 10:42:21.387734 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnwwq\" (UniqueName: \"kubernetes.io/projected/046e01ea-d2ef-4013-9bc2-65a9b5f9caae-kube-api-access-gnwwq\") pod \"redhat-operators-24lpn\" (UID: \"046e01ea-d2ef-4013-9bc2-65a9b5f9caae\") " pod="openshift-marketplace/redhat-operators-24lpn" Sep 29 10:42:21 crc kubenswrapper[4891]: I0929 10:42:21.540586 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-24lpn" Sep 29 10:42:21 crc kubenswrapper[4891]: I0929 10:42:21.997236 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-24lpn"] Sep 29 10:42:22 crc kubenswrapper[4891]: I0929 10:42:22.864451 4891 generic.go:334] "Generic (PLEG): container finished" podID="046e01ea-d2ef-4013-9bc2-65a9b5f9caae" containerID="0537cfaf3064f475592d64c348de8d7beaa5db7e0c0c07595bc9a40d275e1bc4" exitCode=0 Sep 29 10:42:22 crc kubenswrapper[4891]: I0929 10:42:22.864548 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-24lpn" event={"ID":"046e01ea-d2ef-4013-9bc2-65a9b5f9caae","Type":"ContainerDied","Data":"0537cfaf3064f475592d64c348de8d7beaa5db7e0c0c07595bc9a40d275e1bc4"} Sep 29 10:42:22 crc kubenswrapper[4891]: I0929 10:42:22.865016 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-24lpn" event={"ID":"046e01ea-d2ef-4013-9bc2-65a9b5f9caae","Type":"ContainerStarted","Data":"d7145a686996a1ccc758c77b9113ad8ac4c3a89f2435bd2f8e83f1060e537ebe"} Sep 29 10:42:27 crc kubenswrapper[4891]: I0929 10:42:27.906842 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-24lpn" event={"ID":"046e01ea-d2ef-4013-9bc2-65a9b5f9caae","Type":"ContainerStarted","Data":"3ca1ecf84ff09d21f5a75db5c2f50d11de860665ae51229e6307d1036285d03c"} Sep 29 10:42:31 crc kubenswrapper[4891]: I0929 10:42:31.396686 4891 scope.go:117] "RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:42:31 crc kubenswrapper[4891]: E0929 10:42:31.397835 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:42:32 crc kubenswrapper[4891]: I0929 10:42:32.966327 4891 generic.go:334] "Generic (PLEG): container finished" podID="046e01ea-d2ef-4013-9bc2-65a9b5f9caae" containerID="3ca1ecf84ff09d21f5a75db5c2f50d11de860665ae51229e6307d1036285d03c" exitCode=0 Sep 29 10:42:32 crc kubenswrapper[4891]: I0929 10:42:32.966421 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-24lpn" event={"ID":"046e01ea-d2ef-4013-9bc2-65a9b5f9caae","Type":"ContainerDied","Data":"3ca1ecf84ff09d21f5a75db5c2f50d11de860665ae51229e6307d1036285d03c"} Sep 29 10:42:33 crc kubenswrapper[4891]: I0929 10:42:33.982373 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-24lpn" event={"ID":"046e01ea-d2ef-4013-9bc2-65a9b5f9caae","Type":"ContainerStarted","Data":"0124960a6f85ae679c28586fa7cac0b5ff902c008458437f77f131e7488a7fa3"} Sep 29 10:42:34 crc kubenswrapper[4891]: I0929 10:42:34.005495 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-24lpn" podStartSLOduration=2.280571387 podStartE2EDuration="13.005477673s" podCreationTimestamp="2025-09-29 10:42:21 +0000 UTC" firstStartedPulling="2025-09-29 10:42:22.867395482 +0000 UTC m=+3273.072563803" lastFinishedPulling="2025-09-29 10:42:33.592301768 +0000 UTC m=+3283.797470089" observedRunningTime="2025-09-29 10:42:33.999691956 +0000 UTC m=+3284.204860277" watchObservedRunningTime="2025-09-29 10:42:34.005477673 +0000 UTC m=+3284.210645994" Sep 29 10:42:41 crc kubenswrapper[4891]: I0929 10:42:41.543946 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-24lpn" Sep 29 10:42:41 crc kubenswrapper[4891]: I0929 
10:42:41.544545 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-24lpn" Sep 29 10:42:41 crc kubenswrapper[4891]: I0929 10:42:41.622513 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-24lpn" Sep 29 10:42:42 crc kubenswrapper[4891]: I0929 10:42:42.119156 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-24lpn" Sep 29 10:42:42 crc kubenswrapper[4891]: I0929 10:42:42.790890 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-24lpn"] Sep 29 10:42:44 crc kubenswrapper[4891]: I0929 10:42:44.066300 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-24lpn" podUID="046e01ea-d2ef-4013-9bc2-65a9b5f9caae" containerName="registry-server" containerID="cri-o://0124960a6f85ae679c28586fa7cac0b5ff902c008458437f77f131e7488a7fa3" gracePeriod=2 Sep 29 10:42:44 crc kubenswrapper[4891]: I0929 10:42:44.547770 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-24lpn" Sep 29 10:42:44 crc kubenswrapper[4891]: I0929 10:42:44.716672 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/046e01ea-d2ef-4013-9bc2-65a9b5f9caae-catalog-content\") pod \"046e01ea-d2ef-4013-9bc2-65a9b5f9caae\" (UID: \"046e01ea-d2ef-4013-9bc2-65a9b5f9caae\") " Sep 29 10:42:44 crc kubenswrapper[4891]: I0929 10:42:44.716785 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnwwq\" (UniqueName: \"kubernetes.io/projected/046e01ea-d2ef-4013-9bc2-65a9b5f9caae-kube-api-access-gnwwq\") pod \"046e01ea-d2ef-4013-9bc2-65a9b5f9caae\" (UID: \"046e01ea-d2ef-4013-9bc2-65a9b5f9caae\") " Sep 29 10:42:44 crc kubenswrapper[4891]: I0929 10:42:44.717214 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/046e01ea-d2ef-4013-9bc2-65a9b5f9caae-utilities\") pod \"046e01ea-d2ef-4013-9bc2-65a9b5f9caae\" (UID: \"046e01ea-d2ef-4013-9bc2-65a9b5f9caae\") " Sep 29 10:42:44 crc kubenswrapper[4891]: I0929 10:42:44.718139 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/046e01ea-d2ef-4013-9bc2-65a9b5f9caae-utilities" (OuterVolumeSpecName: "utilities") pod "046e01ea-d2ef-4013-9bc2-65a9b5f9caae" (UID: "046e01ea-d2ef-4013-9bc2-65a9b5f9caae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:42:44 crc kubenswrapper[4891]: I0929 10:42:44.728409 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/046e01ea-d2ef-4013-9bc2-65a9b5f9caae-kube-api-access-gnwwq" (OuterVolumeSpecName: "kube-api-access-gnwwq") pod "046e01ea-d2ef-4013-9bc2-65a9b5f9caae" (UID: "046e01ea-d2ef-4013-9bc2-65a9b5f9caae"). InnerVolumeSpecName "kube-api-access-gnwwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:42:44 crc kubenswrapper[4891]: I0929 10:42:44.799108 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/046e01ea-d2ef-4013-9bc2-65a9b5f9caae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "046e01ea-d2ef-4013-9bc2-65a9b5f9caae" (UID: "046e01ea-d2ef-4013-9bc2-65a9b5f9caae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:42:44 crc kubenswrapper[4891]: I0929 10:42:44.818598 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/046e01ea-d2ef-4013-9bc2-65a9b5f9caae-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:42:44 crc kubenswrapper[4891]: I0929 10:42:44.818625 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/046e01ea-d2ef-4013-9bc2-65a9b5f9caae-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:42:44 crc kubenswrapper[4891]: I0929 10:42:44.818636 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnwwq\" (UniqueName: \"kubernetes.io/projected/046e01ea-d2ef-4013-9bc2-65a9b5f9caae-kube-api-access-gnwwq\") on node \"crc\" DevicePath \"\"" Sep 29 10:42:45 crc kubenswrapper[4891]: I0929 10:42:45.080985 4891 generic.go:334] "Generic (PLEG): container finished" podID="046e01ea-d2ef-4013-9bc2-65a9b5f9caae" containerID="0124960a6f85ae679c28586fa7cac0b5ff902c008458437f77f131e7488a7fa3" exitCode=0 Sep 29 10:42:45 crc kubenswrapper[4891]: I0929 10:42:45.081046 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-24lpn" event={"ID":"046e01ea-d2ef-4013-9bc2-65a9b5f9caae","Type":"ContainerDied","Data":"0124960a6f85ae679c28586fa7cac0b5ff902c008458437f77f131e7488a7fa3"} Sep 29 10:42:45 crc kubenswrapper[4891]: I0929 10:42:45.081124 4891 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-24lpn" Sep 29 10:42:45 crc kubenswrapper[4891]: I0929 10:42:45.083193 4891 scope.go:117] "RemoveContainer" containerID="0124960a6f85ae679c28586fa7cac0b5ff902c008458437f77f131e7488a7fa3" Sep 29 10:42:45 crc kubenswrapper[4891]: I0929 10:42:45.083039 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-24lpn" event={"ID":"046e01ea-d2ef-4013-9bc2-65a9b5f9caae","Type":"ContainerDied","Data":"d7145a686996a1ccc758c77b9113ad8ac4c3a89f2435bd2f8e83f1060e537ebe"} Sep 29 10:42:45 crc kubenswrapper[4891]: I0929 10:42:45.104408 4891 scope.go:117] "RemoveContainer" containerID="3ca1ecf84ff09d21f5a75db5c2f50d11de860665ae51229e6307d1036285d03c" Sep 29 10:42:45 crc kubenswrapper[4891]: I0929 10:42:45.148240 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-24lpn"] Sep 29 10:42:45 crc kubenswrapper[4891]: I0929 10:42:45.164921 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-24lpn"] Sep 29 10:42:45 crc kubenswrapper[4891]: I0929 10:42:45.171729 4891 scope.go:117] "RemoveContainer" containerID="0537cfaf3064f475592d64c348de8d7beaa5db7e0c0c07595bc9a40d275e1bc4" Sep 29 10:42:45 crc kubenswrapper[4891]: I0929 10:42:45.215457 4891 scope.go:117] "RemoveContainer" containerID="0124960a6f85ae679c28586fa7cac0b5ff902c008458437f77f131e7488a7fa3" Sep 29 10:42:45 crc kubenswrapper[4891]: E0929 10:42:45.215922 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0124960a6f85ae679c28586fa7cac0b5ff902c008458437f77f131e7488a7fa3\": container with ID starting with 0124960a6f85ae679c28586fa7cac0b5ff902c008458437f77f131e7488a7fa3 not found: ID does not exist" containerID="0124960a6f85ae679c28586fa7cac0b5ff902c008458437f77f131e7488a7fa3" Sep 29 10:42:45 crc kubenswrapper[4891]: I0929 10:42:45.215965 4891 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0124960a6f85ae679c28586fa7cac0b5ff902c008458437f77f131e7488a7fa3"} err="failed to get container status \"0124960a6f85ae679c28586fa7cac0b5ff902c008458437f77f131e7488a7fa3\": rpc error: code = NotFound desc = could not find container \"0124960a6f85ae679c28586fa7cac0b5ff902c008458437f77f131e7488a7fa3\": container with ID starting with 0124960a6f85ae679c28586fa7cac0b5ff902c008458437f77f131e7488a7fa3 not found: ID does not exist" Sep 29 10:42:45 crc kubenswrapper[4891]: I0929 10:42:45.216022 4891 scope.go:117] "RemoveContainer" containerID="3ca1ecf84ff09d21f5a75db5c2f50d11de860665ae51229e6307d1036285d03c" Sep 29 10:42:45 crc kubenswrapper[4891]: E0929 10:42:45.216422 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ca1ecf84ff09d21f5a75db5c2f50d11de860665ae51229e6307d1036285d03c\": container with ID starting with 3ca1ecf84ff09d21f5a75db5c2f50d11de860665ae51229e6307d1036285d03c not found: ID does not exist" containerID="3ca1ecf84ff09d21f5a75db5c2f50d11de860665ae51229e6307d1036285d03c" Sep 29 10:42:45 crc kubenswrapper[4891]: I0929 10:42:45.216483 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ca1ecf84ff09d21f5a75db5c2f50d11de860665ae51229e6307d1036285d03c"} err="failed to get container status \"3ca1ecf84ff09d21f5a75db5c2f50d11de860665ae51229e6307d1036285d03c\": rpc error: code = NotFound desc = could not find container \"3ca1ecf84ff09d21f5a75db5c2f50d11de860665ae51229e6307d1036285d03c\": container with ID starting with 3ca1ecf84ff09d21f5a75db5c2f50d11de860665ae51229e6307d1036285d03c not found: ID does not exist" Sep 29 10:42:45 crc kubenswrapper[4891]: I0929 10:42:45.216501 4891 scope.go:117] "RemoveContainer" containerID="0537cfaf3064f475592d64c348de8d7beaa5db7e0c0c07595bc9a40d275e1bc4" Sep 29 10:42:45 crc kubenswrapper[4891]: E0929 
10:42:45.217115 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0537cfaf3064f475592d64c348de8d7beaa5db7e0c0c07595bc9a40d275e1bc4\": container with ID starting with 0537cfaf3064f475592d64c348de8d7beaa5db7e0c0c07595bc9a40d275e1bc4 not found: ID does not exist" containerID="0537cfaf3064f475592d64c348de8d7beaa5db7e0c0c07595bc9a40d275e1bc4" Sep 29 10:42:45 crc kubenswrapper[4891]: I0929 10:42:45.217139 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0537cfaf3064f475592d64c348de8d7beaa5db7e0c0c07595bc9a40d275e1bc4"} err="failed to get container status \"0537cfaf3064f475592d64c348de8d7beaa5db7e0c0c07595bc9a40d275e1bc4\": rpc error: code = NotFound desc = could not find container \"0537cfaf3064f475592d64c348de8d7beaa5db7e0c0c07595bc9a40d275e1bc4\": container with ID starting with 0537cfaf3064f475592d64c348de8d7beaa5db7e0c0c07595bc9a40d275e1bc4 not found: ID does not exist" Sep 29 10:42:45 crc kubenswrapper[4891]: I0929 10:42:45.397212 4891 scope.go:117] "RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:42:46 crc kubenswrapper[4891]: I0929 10:42:46.099193 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerStarted","Data":"cdab4385752c540734d6236c2923dc60e973c802c1b2f3d4b59745b776b95511"} Sep 29 10:42:46 crc kubenswrapper[4891]: I0929 10:42:46.408686 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="046e01ea-d2ef-4013-9bc2-65a9b5f9caae" path="/var/lib/kubelet/pods/046e01ea-d2ef-4013-9bc2-65a9b5f9caae/volumes" Sep 29 10:43:42 crc kubenswrapper[4891]: I0929 10:43:42.053879 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lncp9"] Sep 29 10:43:42 crc kubenswrapper[4891]: E0929 
10:43:42.054711 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="046e01ea-d2ef-4013-9bc2-65a9b5f9caae" containerName="registry-server" Sep 29 10:43:42 crc kubenswrapper[4891]: I0929 10:43:42.054725 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="046e01ea-d2ef-4013-9bc2-65a9b5f9caae" containerName="registry-server" Sep 29 10:43:42 crc kubenswrapper[4891]: E0929 10:43:42.054754 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="046e01ea-d2ef-4013-9bc2-65a9b5f9caae" containerName="extract-utilities" Sep 29 10:43:42 crc kubenswrapper[4891]: I0929 10:43:42.054760 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="046e01ea-d2ef-4013-9bc2-65a9b5f9caae" containerName="extract-utilities" Sep 29 10:43:42 crc kubenswrapper[4891]: E0929 10:43:42.054771 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="046e01ea-d2ef-4013-9bc2-65a9b5f9caae" containerName="extract-content" Sep 29 10:43:42 crc kubenswrapper[4891]: I0929 10:43:42.054779 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="046e01ea-d2ef-4013-9bc2-65a9b5f9caae" containerName="extract-content" Sep 29 10:43:42 crc kubenswrapper[4891]: I0929 10:43:42.054973 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="046e01ea-d2ef-4013-9bc2-65a9b5f9caae" containerName="registry-server" Sep 29 10:43:42 crc kubenswrapper[4891]: I0929 10:43:42.056480 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lncp9" Sep 29 10:43:42 crc kubenswrapper[4891]: I0929 10:43:42.070308 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lncp9"] Sep 29 10:43:42 crc kubenswrapper[4891]: I0929 10:43:42.108835 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjsfm\" (UniqueName: \"kubernetes.io/projected/ebde5e83-8a20-4660-a273-8988e0066642-kube-api-access-fjsfm\") pod \"redhat-marketplace-lncp9\" (UID: \"ebde5e83-8a20-4660-a273-8988e0066642\") " pod="openshift-marketplace/redhat-marketplace-lncp9" Sep 29 10:43:42 crc kubenswrapper[4891]: I0929 10:43:42.108904 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebde5e83-8a20-4660-a273-8988e0066642-utilities\") pod \"redhat-marketplace-lncp9\" (UID: \"ebde5e83-8a20-4660-a273-8988e0066642\") " pod="openshift-marketplace/redhat-marketplace-lncp9" Sep 29 10:43:42 crc kubenswrapper[4891]: I0929 10:43:42.108986 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebde5e83-8a20-4660-a273-8988e0066642-catalog-content\") pod \"redhat-marketplace-lncp9\" (UID: \"ebde5e83-8a20-4660-a273-8988e0066642\") " pod="openshift-marketplace/redhat-marketplace-lncp9" Sep 29 10:43:42 crc kubenswrapper[4891]: I0929 10:43:42.210443 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjsfm\" (UniqueName: \"kubernetes.io/projected/ebde5e83-8a20-4660-a273-8988e0066642-kube-api-access-fjsfm\") pod \"redhat-marketplace-lncp9\" (UID: \"ebde5e83-8a20-4660-a273-8988e0066642\") " pod="openshift-marketplace/redhat-marketplace-lncp9" Sep 29 10:43:42 crc kubenswrapper[4891]: I0929 10:43:42.210524 4891 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebde5e83-8a20-4660-a273-8988e0066642-utilities\") pod \"redhat-marketplace-lncp9\" (UID: \"ebde5e83-8a20-4660-a273-8988e0066642\") " pod="openshift-marketplace/redhat-marketplace-lncp9" Sep 29 10:43:42 crc kubenswrapper[4891]: I0929 10:43:42.210633 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebde5e83-8a20-4660-a273-8988e0066642-catalog-content\") pod \"redhat-marketplace-lncp9\" (UID: \"ebde5e83-8a20-4660-a273-8988e0066642\") " pod="openshift-marketplace/redhat-marketplace-lncp9" Sep 29 10:43:42 crc kubenswrapper[4891]: I0929 10:43:42.211280 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebde5e83-8a20-4660-a273-8988e0066642-catalog-content\") pod \"redhat-marketplace-lncp9\" (UID: \"ebde5e83-8a20-4660-a273-8988e0066642\") " pod="openshift-marketplace/redhat-marketplace-lncp9" Sep 29 10:43:42 crc kubenswrapper[4891]: I0929 10:43:42.211292 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebde5e83-8a20-4660-a273-8988e0066642-utilities\") pod \"redhat-marketplace-lncp9\" (UID: \"ebde5e83-8a20-4660-a273-8988e0066642\") " pod="openshift-marketplace/redhat-marketplace-lncp9" Sep 29 10:43:42 crc kubenswrapper[4891]: I0929 10:43:42.229744 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjsfm\" (UniqueName: \"kubernetes.io/projected/ebde5e83-8a20-4660-a273-8988e0066642-kube-api-access-fjsfm\") pod \"redhat-marketplace-lncp9\" (UID: \"ebde5e83-8a20-4660-a273-8988e0066642\") " pod="openshift-marketplace/redhat-marketplace-lncp9" Sep 29 10:43:42 crc kubenswrapper[4891]: I0929 10:43:42.428632 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lncp9" Sep 29 10:43:42 crc kubenswrapper[4891]: W0929 10:43:42.891959 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebde5e83_8a20_4660_a273_8988e0066642.slice/crio-b6ada65d0c62e59760792c76ed961cb675aa551672fea031d043d0ba8e6bb9ff WatchSource:0}: Error finding container b6ada65d0c62e59760792c76ed961cb675aa551672fea031d043d0ba8e6bb9ff: Status 404 returned error can't find the container with id b6ada65d0c62e59760792c76ed961cb675aa551672fea031d043d0ba8e6bb9ff Sep 29 10:43:42 crc kubenswrapper[4891]: I0929 10:43:42.895419 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lncp9"] Sep 29 10:43:43 crc kubenswrapper[4891]: I0929 10:43:43.619173 4891 generic.go:334] "Generic (PLEG): container finished" podID="ebde5e83-8a20-4660-a273-8988e0066642" containerID="b448b6bb850441ccff9e8be371823b9752b89cc55b495dcfbb5b537ac9a45509" exitCode=0 Sep 29 10:43:43 crc kubenswrapper[4891]: I0929 10:43:43.619454 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lncp9" event={"ID":"ebde5e83-8a20-4660-a273-8988e0066642","Type":"ContainerDied","Data":"b448b6bb850441ccff9e8be371823b9752b89cc55b495dcfbb5b537ac9a45509"} Sep 29 10:43:43 crc kubenswrapper[4891]: I0929 10:43:43.619513 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lncp9" event={"ID":"ebde5e83-8a20-4660-a273-8988e0066642","Type":"ContainerStarted","Data":"b6ada65d0c62e59760792c76ed961cb675aa551672fea031d043d0ba8e6bb9ff"} Sep 29 10:43:45 crc kubenswrapper[4891]: I0929 10:43:45.642593 4891 generic.go:334] "Generic (PLEG): container finished" podID="ebde5e83-8a20-4660-a273-8988e0066642" containerID="ced7c5021bc84792a4d2566403fa6184fec516d376ad8e79e11e94a98917f9b3" exitCode=0 Sep 29 10:43:45 crc kubenswrapper[4891]: I0929 
10:43:45.642737 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lncp9" event={"ID":"ebde5e83-8a20-4660-a273-8988e0066642","Type":"ContainerDied","Data":"ced7c5021bc84792a4d2566403fa6184fec516d376ad8e79e11e94a98917f9b3"} Sep 29 10:43:46 crc kubenswrapper[4891]: I0929 10:43:46.653826 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lncp9" event={"ID":"ebde5e83-8a20-4660-a273-8988e0066642","Type":"ContainerStarted","Data":"aa535ded1d6fa2449b3bddcccae85ff1fb16cdbe61dde446263bf872fef41be7"} Sep 29 10:43:46 crc kubenswrapper[4891]: I0929 10:43:46.679100 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lncp9" podStartSLOduration=1.934428827 podStartE2EDuration="4.679087199s" podCreationTimestamp="2025-09-29 10:43:42 +0000 UTC" firstStartedPulling="2025-09-29 10:43:43.621863045 +0000 UTC m=+3353.827031366" lastFinishedPulling="2025-09-29 10:43:46.366521407 +0000 UTC m=+3356.571689738" observedRunningTime="2025-09-29 10:43:46.672328434 +0000 UTC m=+3356.877496755" watchObservedRunningTime="2025-09-29 10:43:46.679087199 +0000 UTC m=+3356.884255510" Sep 29 10:43:52 crc kubenswrapper[4891]: I0929 10:43:52.428811 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lncp9" Sep 29 10:43:52 crc kubenswrapper[4891]: I0929 10:43:52.429433 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lncp9" Sep 29 10:43:52 crc kubenswrapper[4891]: I0929 10:43:52.475675 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lncp9" Sep 29 10:43:52 crc kubenswrapper[4891]: I0929 10:43:52.751312 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lncp9" Sep 29 
10:43:52 crc kubenswrapper[4891]: I0929 10:43:52.792460 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lncp9"] Sep 29 10:43:54 crc kubenswrapper[4891]: I0929 10:43:54.732353 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lncp9" podUID="ebde5e83-8a20-4660-a273-8988e0066642" containerName="registry-server" containerID="cri-o://aa535ded1d6fa2449b3bddcccae85ff1fb16cdbe61dde446263bf872fef41be7" gracePeriod=2 Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.206126 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lncp9" Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.362964 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebde5e83-8a20-4660-a273-8988e0066642-catalog-content\") pod \"ebde5e83-8a20-4660-a273-8988e0066642\" (UID: \"ebde5e83-8a20-4660-a273-8988e0066642\") " Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.363180 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebde5e83-8a20-4660-a273-8988e0066642-utilities\") pod \"ebde5e83-8a20-4660-a273-8988e0066642\" (UID: \"ebde5e83-8a20-4660-a273-8988e0066642\") " Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.363209 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjsfm\" (UniqueName: \"kubernetes.io/projected/ebde5e83-8a20-4660-a273-8988e0066642-kube-api-access-fjsfm\") pod \"ebde5e83-8a20-4660-a273-8988e0066642\" (UID: \"ebde5e83-8a20-4660-a273-8988e0066642\") " Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.364094 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ebde5e83-8a20-4660-a273-8988e0066642-utilities" (OuterVolumeSpecName: "utilities") pod "ebde5e83-8a20-4660-a273-8988e0066642" (UID: "ebde5e83-8a20-4660-a273-8988e0066642"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.371073 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebde5e83-8a20-4660-a273-8988e0066642-kube-api-access-fjsfm" (OuterVolumeSpecName: "kube-api-access-fjsfm") pod "ebde5e83-8a20-4660-a273-8988e0066642" (UID: "ebde5e83-8a20-4660-a273-8988e0066642"). InnerVolumeSpecName "kube-api-access-fjsfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.382977 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebde5e83-8a20-4660-a273-8988e0066642-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebde5e83-8a20-4660-a273-8988e0066642" (UID: "ebde5e83-8a20-4660-a273-8988e0066642"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.465342 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebde5e83-8a20-4660-a273-8988e0066642-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.465381 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebde5e83-8a20-4660-a273-8988e0066642-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.465393 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjsfm\" (UniqueName: \"kubernetes.io/projected/ebde5e83-8a20-4660-a273-8988e0066642-kube-api-access-fjsfm\") on node \"crc\" DevicePath \"\"" Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.744229 4891 generic.go:334] "Generic (PLEG): container finished" podID="ebde5e83-8a20-4660-a273-8988e0066642" containerID="aa535ded1d6fa2449b3bddcccae85ff1fb16cdbe61dde446263bf872fef41be7" exitCode=0 Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.744310 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lncp9" Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.744302 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lncp9" event={"ID":"ebde5e83-8a20-4660-a273-8988e0066642","Type":"ContainerDied","Data":"aa535ded1d6fa2449b3bddcccae85ff1fb16cdbe61dde446263bf872fef41be7"} Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.744437 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lncp9" event={"ID":"ebde5e83-8a20-4660-a273-8988e0066642","Type":"ContainerDied","Data":"b6ada65d0c62e59760792c76ed961cb675aa551672fea031d043d0ba8e6bb9ff"} Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.744478 4891 scope.go:117] "RemoveContainer" containerID="aa535ded1d6fa2449b3bddcccae85ff1fb16cdbe61dde446263bf872fef41be7" Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.764542 4891 scope.go:117] "RemoveContainer" containerID="ced7c5021bc84792a4d2566403fa6184fec516d376ad8e79e11e94a98917f9b3" Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.785416 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lncp9"] Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.793996 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lncp9"] Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.810614 4891 scope.go:117] "RemoveContainer" containerID="b448b6bb850441ccff9e8be371823b9752b89cc55b495dcfbb5b537ac9a45509" Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.835576 4891 scope.go:117] "RemoveContainer" containerID="aa535ded1d6fa2449b3bddcccae85ff1fb16cdbe61dde446263bf872fef41be7" Sep 29 10:43:55 crc kubenswrapper[4891]: E0929 10:43:55.836274 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"aa535ded1d6fa2449b3bddcccae85ff1fb16cdbe61dde446263bf872fef41be7\": container with ID starting with aa535ded1d6fa2449b3bddcccae85ff1fb16cdbe61dde446263bf872fef41be7 not found: ID does not exist" containerID="aa535ded1d6fa2449b3bddcccae85ff1fb16cdbe61dde446263bf872fef41be7" Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.836322 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa535ded1d6fa2449b3bddcccae85ff1fb16cdbe61dde446263bf872fef41be7"} err="failed to get container status \"aa535ded1d6fa2449b3bddcccae85ff1fb16cdbe61dde446263bf872fef41be7\": rpc error: code = NotFound desc = could not find container \"aa535ded1d6fa2449b3bddcccae85ff1fb16cdbe61dde446263bf872fef41be7\": container with ID starting with aa535ded1d6fa2449b3bddcccae85ff1fb16cdbe61dde446263bf872fef41be7 not found: ID does not exist" Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.836349 4891 scope.go:117] "RemoveContainer" containerID="ced7c5021bc84792a4d2566403fa6184fec516d376ad8e79e11e94a98917f9b3" Sep 29 10:43:55 crc kubenswrapper[4891]: E0929 10:43:55.836697 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ced7c5021bc84792a4d2566403fa6184fec516d376ad8e79e11e94a98917f9b3\": container with ID starting with ced7c5021bc84792a4d2566403fa6184fec516d376ad8e79e11e94a98917f9b3 not found: ID does not exist" containerID="ced7c5021bc84792a4d2566403fa6184fec516d376ad8e79e11e94a98917f9b3" Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.836762 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ced7c5021bc84792a4d2566403fa6184fec516d376ad8e79e11e94a98917f9b3"} err="failed to get container status \"ced7c5021bc84792a4d2566403fa6184fec516d376ad8e79e11e94a98917f9b3\": rpc error: code = NotFound desc = could not find container \"ced7c5021bc84792a4d2566403fa6184fec516d376ad8e79e11e94a98917f9b3\": container with ID 
starting with ced7c5021bc84792a4d2566403fa6184fec516d376ad8e79e11e94a98917f9b3 not found: ID does not exist" Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.836801 4891 scope.go:117] "RemoveContainer" containerID="b448b6bb850441ccff9e8be371823b9752b89cc55b495dcfbb5b537ac9a45509" Sep 29 10:43:55 crc kubenswrapper[4891]: E0929 10:43:55.837215 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b448b6bb850441ccff9e8be371823b9752b89cc55b495dcfbb5b537ac9a45509\": container with ID starting with b448b6bb850441ccff9e8be371823b9752b89cc55b495dcfbb5b537ac9a45509 not found: ID does not exist" containerID="b448b6bb850441ccff9e8be371823b9752b89cc55b495dcfbb5b537ac9a45509" Sep 29 10:43:55 crc kubenswrapper[4891]: I0929 10:43:55.837248 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b448b6bb850441ccff9e8be371823b9752b89cc55b495dcfbb5b537ac9a45509"} err="failed to get container status \"b448b6bb850441ccff9e8be371823b9752b89cc55b495dcfbb5b537ac9a45509\": rpc error: code = NotFound desc = could not find container \"b448b6bb850441ccff9e8be371823b9752b89cc55b495dcfbb5b537ac9a45509\": container with ID starting with b448b6bb850441ccff9e8be371823b9752b89cc55b495dcfbb5b537ac9a45509 not found: ID does not exist" Sep 29 10:43:56 crc kubenswrapper[4891]: I0929 10:43:56.419920 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebde5e83-8a20-4660-a273-8988e0066642" path="/var/lib/kubelet/pods/ebde5e83-8a20-4660-a273-8988e0066642/volumes" Sep 29 10:45:00 crc kubenswrapper[4891]: I0929 10:45:00.172648 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319045-xpn86"] Sep 29 10:45:00 crc kubenswrapper[4891]: E0929 10:45:00.173749 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebde5e83-8a20-4660-a273-8988e0066642" containerName="extract-content" Sep 29 
10:45:00 crc kubenswrapper[4891]: I0929 10:45:00.173769 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebde5e83-8a20-4660-a273-8988e0066642" containerName="extract-content" Sep 29 10:45:00 crc kubenswrapper[4891]: E0929 10:45:00.173786 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebde5e83-8a20-4660-a273-8988e0066642" containerName="registry-server" Sep 29 10:45:00 crc kubenswrapper[4891]: I0929 10:45:00.173883 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebde5e83-8a20-4660-a273-8988e0066642" containerName="registry-server" Sep 29 10:45:00 crc kubenswrapper[4891]: E0929 10:45:00.173917 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebde5e83-8a20-4660-a273-8988e0066642" containerName="extract-utilities" Sep 29 10:45:00 crc kubenswrapper[4891]: I0929 10:45:00.173926 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebde5e83-8a20-4660-a273-8988e0066642" containerName="extract-utilities" Sep 29 10:45:00 crc kubenswrapper[4891]: I0929 10:45:00.174122 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebde5e83-8a20-4660-a273-8988e0066642" containerName="registry-server" Sep 29 10:45:00 crc kubenswrapper[4891]: I0929 10:45:00.174864 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-xpn86" Sep 29 10:45:00 crc kubenswrapper[4891]: I0929 10:45:00.176931 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 29 10:45:00 crc kubenswrapper[4891]: I0929 10:45:00.178080 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 29 10:45:00 crc kubenswrapper[4891]: I0929 10:45:00.184483 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319045-xpn86"] Sep 29 10:45:00 crc kubenswrapper[4891]: I0929 10:45:00.214370 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrq72\" (UniqueName: \"kubernetes.io/projected/bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694-kube-api-access-lrq72\") pod \"collect-profiles-29319045-xpn86\" (UID: \"bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-xpn86" Sep 29 10:45:00 crc kubenswrapper[4891]: I0929 10:45:00.214756 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694-secret-volume\") pod \"collect-profiles-29319045-xpn86\" (UID: \"bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-xpn86" Sep 29 10:45:00 crc kubenswrapper[4891]: I0929 10:45:00.214843 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694-config-volume\") pod \"collect-profiles-29319045-xpn86\" (UID: \"bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-xpn86" Sep 29 10:45:00 crc kubenswrapper[4891]: I0929 10:45:00.316738 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrq72\" (UniqueName: \"kubernetes.io/projected/bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694-kube-api-access-lrq72\") pod \"collect-profiles-29319045-xpn86\" (UID: \"bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-xpn86" Sep 29 10:45:00 crc kubenswrapper[4891]: I0929 10:45:00.317136 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694-secret-volume\") pod \"collect-profiles-29319045-xpn86\" (UID: \"bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-xpn86" Sep 29 10:45:00 crc kubenswrapper[4891]: I0929 10:45:00.317269 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694-config-volume\") pod \"collect-profiles-29319045-xpn86\" (UID: \"bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-xpn86" Sep 29 10:45:00 crc kubenswrapper[4891]: I0929 10:45:00.318330 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694-config-volume\") pod \"collect-profiles-29319045-xpn86\" (UID: \"bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-xpn86" Sep 29 10:45:00 crc kubenswrapper[4891]: I0929 10:45:00.330704 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694-secret-volume\") pod \"collect-profiles-29319045-xpn86\" (UID: \"bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-xpn86" Sep 29 10:45:00 crc kubenswrapper[4891]: I0929 10:45:00.335610 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrq72\" (UniqueName: \"kubernetes.io/projected/bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694-kube-api-access-lrq72\") pod \"collect-profiles-29319045-xpn86\" (UID: \"bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-xpn86" Sep 29 10:45:00 crc kubenswrapper[4891]: I0929 10:45:00.498555 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-xpn86" Sep 29 10:45:00 crc kubenswrapper[4891]: I0929 10:45:00.945355 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319045-xpn86"] Sep 29 10:45:01 crc kubenswrapper[4891]: I0929 10:45:01.360037 4891 generic.go:334] "Generic (PLEG): container finished" podID="bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694" containerID="0da3913fe639b75a4bbbe3b0bc745213b6074ae3f8eafafa1435a74c4f631130" exitCode=0 Sep 29 10:45:01 crc kubenswrapper[4891]: I0929 10:45:01.360109 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-xpn86" event={"ID":"bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694","Type":"ContainerDied","Data":"0da3913fe639b75a4bbbe3b0bc745213b6074ae3f8eafafa1435a74c4f631130"} Sep 29 10:45:01 crc kubenswrapper[4891]: I0929 10:45:01.360303 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-xpn86" 
event={"ID":"bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694","Type":"ContainerStarted","Data":"1401984292ff13da5f204d2b6c3c783d41a735dc8e6945143452037e24c68df0"} Sep 29 10:45:02 crc kubenswrapper[4891]: I0929 10:45:02.801080 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-xpn86" Sep 29 10:45:02 crc kubenswrapper[4891]: I0929 10:45:02.883492 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrq72\" (UniqueName: \"kubernetes.io/projected/bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694-kube-api-access-lrq72\") pod \"bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694\" (UID: \"bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694\") " Sep 29 10:45:02 crc kubenswrapper[4891]: I0929 10:45:02.883640 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694-config-volume\") pod \"bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694\" (UID: \"bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694\") " Sep 29 10:45:02 crc kubenswrapper[4891]: I0929 10:45:02.883713 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694-secret-volume\") pod \"bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694\" (UID: \"bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694\") " Sep 29 10:45:02 crc kubenswrapper[4891]: I0929 10:45:02.884597 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694-config-volume" (OuterVolumeSpecName: "config-volume") pod "bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694" (UID: "bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:45:02 crc kubenswrapper[4891]: I0929 10:45:02.890360 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694-kube-api-access-lrq72" (OuterVolumeSpecName: "kube-api-access-lrq72") pod "bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694" (UID: "bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694"). InnerVolumeSpecName "kube-api-access-lrq72". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:45:02 crc kubenswrapper[4891]: I0929 10:45:02.890474 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694" (UID: "bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:45:02 crc kubenswrapper[4891]: I0929 10:45:02.986156 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrq72\" (UniqueName: \"kubernetes.io/projected/bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694-kube-api-access-lrq72\") on node \"crc\" DevicePath \"\"" Sep 29 10:45:02 crc kubenswrapper[4891]: I0929 10:45:02.986205 4891 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694-config-volume\") on node \"crc\" DevicePath \"\"" Sep 29 10:45:02 crc kubenswrapper[4891]: I0929 10:45:02.986215 4891 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 29 10:45:03 crc kubenswrapper[4891]: I0929 10:45:03.380557 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-xpn86" 
event={"ID":"bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694","Type":"ContainerDied","Data":"1401984292ff13da5f204d2b6c3c783d41a735dc8e6945143452037e24c68df0"} Sep 29 10:45:03 crc kubenswrapper[4891]: I0929 10:45:03.380603 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-xpn86" Sep 29 10:45:03 crc kubenswrapper[4891]: I0929 10:45:03.380619 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1401984292ff13da5f204d2b6c3c783d41a735dc8e6945143452037e24c68df0" Sep 29 10:45:03 crc kubenswrapper[4891]: I0929 10:45:03.870377 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319000-zw84t"] Sep 29 10:45:03 crc kubenswrapper[4891]: I0929 10:45:03.877779 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319000-zw84t"] Sep 29 10:45:04 crc kubenswrapper[4891]: I0929 10:45:04.409984 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fa97b41-25d5-4223-9fe0-bb9addf89617" path="/var/lib/kubelet/pods/7fa97b41-25d5-4223-9fe0-bb9addf89617/volumes" Sep 29 10:45:06 crc kubenswrapper[4891]: I0929 10:45:06.185869 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:45:06 crc kubenswrapper[4891]: I0929 10:45:06.185963 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:45:17 crc 
kubenswrapper[4891]: I0929 10:45:17.362662 4891 scope.go:117] "RemoveContainer" containerID="9f1d29ce791e74fa17280ecd74ef283a189c29a7f226f273237905f388bb3fbb" Sep 29 10:45:17 crc kubenswrapper[4891]: I0929 10:45:17.516951 4891 generic.go:334] "Generic (PLEG): container finished" podID="ae98e843-bdec-443e-8389-9a58c187f5bd" containerID="1834a36e77aead21ea6fb7d3414bad933a20ea71cabd0b1fe78363461e0fbcb3" exitCode=0 Sep 29 10:45:17 crc kubenswrapper[4891]: I0929 10:45:17.516998 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ae98e843-bdec-443e-8389-9a58c187f5bd","Type":"ContainerDied","Data":"1834a36e77aead21ea6fb7d3414bad933a20ea71cabd0b1fe78363461e0fbcb3"} Sep 29 10:45:18 crc kubenswrapper[4891]: I0929 10:45:18.867693 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 29 10:45:18 crc kubenswrapper[4891]: I0929 10:45:18.996952 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ae98e843-bdec-443e-8389-9a58c187f5bd-openstack-config\") pod \"ae98e843-bdec-443e-8389-9a58c187f5bd\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " Sep 29 10:45:18 crc kubenswrapper[4891]: I0929 10:45:18.997036 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ae98e843-bdec-443e-8389-9a58c187f5bd-test-operator-ephemeral-workdir\") pod \"ae98e843-bdec-443e-8389-9a58c187f5bd\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " Sep 29 10:45:18 crc kubenswrapper[4891]: I0929 10:45:18.997102 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ae98e843-bdec-443e-8389-9a58c187f5bd-test-operator-ephemeral-temporary\") pod \"ae98e843-bdec-443e-8389-9a58c187f5bd\" (UID: 
\"ae98e843-bdec-443e-8389-9a58c187f5bd\") " Sep 29 10:45:18 crc kubenswrapper[4891]: I0929 10:45:18.997176 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ae98e843-bdec-443e-8389-9a58c187f5bd-ca-certs\") pod \"ae98e843-bdec-443e-8389-9a58c187f5bd\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " Sep 29 10:45:18 crc kubenswrapper[4891]: I0929 10:45:18.997217 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpwjn\" (UniqueName: \"kubernetes.io/projected/ae98e843-bdec-443e-8389-9a58c187f5bd-kube-api-access-wpwjn\") pod \"ae98e843-bdec-443e-8389-9a58c187f5bd\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " Sep 29 10:45:18 crc kubenswrapper[4891]: I0929 10:45:18.997236 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae98e843-bdec-443e-8389-9a58c187f5bd-config-data\") pod \"ae98e843-bdec-443e-8389-9a58c187f5bd\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " Sep 29 10:45:18 crc kubenswrapper[4891]: I0929 10:45:18.997251 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ae98e843-bdec-443e-8389-9a58c187f5bd\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " Sep 29 10:45:18 crc kubenswrapper[4891]: I0929 10:45:18.997335 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ae98e843-bdec-443e-8389-9a58c187f5bd-openstack-config-secret\") pod \"ae98e843-bdec-443e-8389-9a58c187f5bd\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " Sep 29 10:45:18 crc kubenswrapper[4891]: I0929 10:45:18.997389 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/ae98e843-bdec-443e-8389-9a58c187f5bd-ssh-key\") pod \"ae98e843-bdec-443e-8389-9a58c187f5bd\" (UID: \"ae98e843-bdec-443e-8389-9a58c187f5bd\") " Sep 29 10:45:19 crc kubenswrapper[4891]: I0929 10:45:18.999171 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae98e843-bdec-443e-8389-9a58c187f5bd-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "ae98e843-bdec-443e-8389-9a58c187f5bd" (UID: "ae98e843-bdec-443e-8389-9a58c187f5bd"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:45:19 crc kubenswrapper[4891]: I0929 10:45:18.999968 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae98e843-bdec-443e-8389-9a58c187f5bd-config-data" (OuterVolumeSpecName: "config-data") pod "ae98e843-bdec-443e-8389-9a58c187f5bd" (UID: "ae98e843-bdec-443e-8389-9a58c187f5bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:45:19 crc kubenswrapper[4891]: I0929 10:45:19.009466 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "test-operator-logs") pod "ae98e843-bdec-443e-8389-9a58c187f5bd" (UID: "ae98e843-bdec-443e-8389-9a58c187f5bd"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 10:45:19 crc kubenswrapper[4891]: I0929 10:45:19.009570 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae98e843-bdec-443e-8389-9a58c187f5bd-kube-api-access-wpwjn" (OuterVolumeSpecName: "kube-api-access-wpwjn") pod "ae98e843-bdec-443e-8389-9a58c187f5bd" (UID: "ae98e843-bdec-443e-8389-9a58c187f5bd"). InnerVolumeSpecName "kube-api-access-wpwjn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:45:19 crc kubenswrapper[4891]: I0929 10:45:19.012967 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae98e843-bdec-443e-8389-9a58c187f5bd-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "ae98e843-bdec-443e-8389-9a58c187f5bd" (UID: "ae98e843-bdec-443e-8389-9a58c187f5bd"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:45:19 crc kubenswrapper[4891]: I0929 10:45:19.028111 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae98e843-bdec-443e-8389-9a58c187f5bd-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ae98e843-bdec-443e-8389-9a58c187f5bd" (UID: "ae98e843-bdec-443e-8389-9a58c187f5bd"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:45:19 crc kubenswrapper[4891]: I0929 10:45:19.030051 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae98e843-bdec-443e-8389-9a58c187f5bd-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "ae98e843-bdec-443e-8389-9a58c187f5bd" (UID: "ae98e843-bdec-443e-8389-9a58c187f5bd"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:45:19 crc kubenswrapper[4891]: I0929 10:45:19.031182 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae98e843-bdec-443e-8389-9a58c187f5bd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ae98e843-bdec-443e-8389-9a58c187f5bd" (UID: "ae98e843-bdec-443e-8389-9a58c187f5bd"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:45:19 crc kubenswrapper[4891]: I0929 10:45:19.052044 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae98e843-bdec-443e-8389-9a58c187f5bd-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ae98e843-bdec-443e-8389-9a58c187f5bd" (UID: "ae98e843-bdec-443e-8389-9a58c187f5bd"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:45:19 crc kubenswrapper[4891]: I0929 10:45:19.099297 4891 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ae98e843-bdec-443e-8389-9a58c187f5bd-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Sep 29 10:45:19 crc kubenswrapper[4891]: I0929 10:45:19.099338 4891 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ae98e843-bdec-443e-8389-9a58c187f5bd-ca-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:45:19 crc kubenswrapper[4891]: I0929 10:45:19.099353 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpwjn\" (UniqueName: \"kubernetes.io/projected/ae98e843-bdec-443e-8389-9a58c187f5bd-kube-api-access-wpwjn\") on node \"crc\" DevicePath \"\"" Sep 29 10:45:19 crc kubenswrapper[4891]: I0929 10:45:19.099367 4891 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae98e843-bdec-443e-8389-9a58c187f5bd-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:45:19 crc kubenswrapper[4891]: I0929 10:45:19.099404 4891 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Sep 29 10:45:19 crc kubenswrapper[4891]: I0929 10:45:19.099415 4891 reconciler_common.go:293] "Volume detached for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ae98e843-bdec-443e-8389-9a58c187f5bd-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Sep 29 10:45:19 crc kubenswrapper[4891]: I0929 10:45:19.099425 4891 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae98e843-bdec-443e-8389-9a58c187f5bd-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:45:19 crc kubenswrapper[4891]: I0929 10:45:19.099434 4891 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ae98e843-bdec-443e-8389-9a58c187f5bd-openstack-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:45:19 crc kubenswrapper[4891]: I0929 10:45:19.099444 4891 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ae98e843-bdec-443e-8389-9a58c187f5bd-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Sep 29 10:45:19 crc kubenswrapper[4891]: I0929 10:45:19.119536 4891 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Sep 29 10:45:19 crc kubenswrapper[4891]: I0929 10:45:19.201469 4891 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Sep 29 10:45:19 crc kubenswrapper[4891]: I0929 10:45:19.534900 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ae98e843-bdec-443e-8389-9a58c187f5bd","Type":"ContainerDied","Data":"ceb40e02e55f099f9de3a7277cd0856ab814ee0f47fb765bee8b0aae2bc1d518"} Sep 29 10:45:19 crc kubenswrapper[4891]: I0929 10:45:19.534947 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ceb40e02e55f099f9de3a7277cd0856ab814ee0f47fb765bee8b0aae2bc1d518" Sep 29 10:45:19 
crc kubenswrapper[4891]: I0929 10:45:19.534994 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 29 10:45:29 crc kubenswrapper[4891]: I0929 10:45:29.702457 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 29 10:45:29 crc kubenswrapper[4891]: E0929 10:45:29.703500 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae98e843-bdec-443e-8389-9a58c187f5bd" containerName="tempest-tests-tempest-tests-runner" Sep 29 10:45:29 crc kubenswrapper[4891]: I0929 10:45:29.703518 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae98e843-bdec-443e-8389-9a58c187f5bd" containerName="tempest-tests-tempest-tests-runner" Sep 29 10:45:29 crc kubenswrapper[4891]: E0929 10:45:29.703534 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694" containerName="collect-profiles" Sep 29 10:45:29 crc kubenswrapper[4891]: I0929 10:45:29.703543 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694" containerName="collect-profiles" Sep 29 10:45:29 crc kubenswrapper[4891]: I0929 10:45:29.703757 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae98e843-bdec-443e-8389-9a58c187f5bd" containerName="tempest-tests-tempest-tests-runner" Sep 29 10:45:29 crc kubenswrapper[4891]: I0929 10:45:29.703776 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf3b655c-cf1d-4e8d-9c5d-78e42ae4b694" containerName="collect-profiles" Sep 29 10:45:29 crc kubenswrapper[4891]: I0929 10:45:29.704520 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 29 10:45:29 crc kubenswrapper[4891]: I0929 10:45:29.706587 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-kc85q" Sep 29 10:45:29 crc kubenswrapper[4891]: I0929 10:45:29.708394 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 29 10:45:29 crc kubenswrapper[4891]: I0929 10:45:29.801086 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bpzh\" (UniqueName: \"kubernetes.io/projected/0ae22834-3106-47d5-a04c-0ab9327991df-kube-api-access-7bpzh\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0ae22834-3106-47d5-a04c-0ab9327991df\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 29 10:45:29 crc kubenswrapper[4891]: I0929 10:45:29.801481 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0ae22834-3106-47d5-a04c-0ab9327991df\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 29 10:45:29 crc kubenswrapper[4891]: I0929 10:45:29.903184 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bpzh\" (UniqueName: \"kubernetes.io/projected/0ae22834-3106-47d5-a04c-0ab9327991df-kube-api-access-7bpzh\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0ae22834-3106-47d5-a04c-0ab9327991df\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 29 10:45:29 crc kubenswrapper[4891]: I0929 10:45:29.903723 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0ae22834-3106-47d5-a04c-0ab9327991df\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 29 10:45:29 crc kubenswrapper[4891]: I0929 10:45:29.904408 4891 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0ae22834-3106-47d5-a04c-0ab9327991df\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 29 10:45:29 crc kubenswrapper[4891]: I0929 10:45:29.929336 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bpzh\" (UniqueName: \"kubernetes.io/projected/0ae22834-3106-47d5-a04c-0ab9327991df-kube-api-access-7bpzh\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0ae22834-3106-47d5-a04c-0ab9327991df\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 29 10:45:29 crc kubenswrapper[4891]: I0929 10:45:29.944059 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0ae22834-3106-47d5-a04c-0ab9327991df\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 29 10:45:30 crc kubenswrapper[4891]: I0929 10:45:30.024618 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 29 10:45:30 crc kubenswrapper[4891]: I0929 10:45:30.482875 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 29 10:45:30 crc kubenswrapper[4891]: I0929 10:45:30.642654 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"0ae22834-3106-47d5-a04c-0ab9327991df","Type":"ContainerStarted","Data":"977ff042e3f0db94e09e90346561f39dda3adab8ede7e8aebae6c39ce2c097d7"} Sep 29 10:45:32 crc kubenswrapper[4891]: I0929 10:45:32.662393 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"0ae22834-3106-47d5-a04c-0ab9327991df","Type":"ContainerStarted","Data":"316892effafe3556575b3921d0edabcf34e8c9154c5791e50fe4fd956a2789ad"} Sep 29 10:45:32 crc kubenswrapper[4891]: I0929 10:45:32.691643 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.5273959489999998 podStartE2EDuration="3.691624094s" podCreationTimestamp="2025-09-29 10:45:29 +0000 UTC" firstStartedPulling="2025-09-29 10:45:30.490420487 +0000 UTC m=+3460.695588808" lastFinishedPulling="2025-09-29 10:45:31.654648622 +0000 UTC m=+3461.859816953" observedRunningTime="2025-09-29 10:45:32.677951439 +0000 UTC m=+3462.883119780" watchObservedRunningTime="2025-09-29 10:45:32.691624094 +0000 UTC m=+3462.896792435" Sep 29 10:45:36 crc kubenswrapper[4891]: I0929 10:45:36.185913 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:45:36 crc 
kubenswrapper[4891]: I0929 10:45:36.186535 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:45:40 crc kubenswrapper[4891]: I0929 10:45:40.620638 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-86625"] Sep 29 10:45:40 crc kubenswrapper[4891]: I0929 10:45:40.624359 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-86625" Sep 29 10:45:40 crc kubenswrapper[4891]: I0929 10:45:40.658542 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-86625"] Sep 29 10:45:40 crc kubenswrapper[4891]: I0929 10:45:40.732916 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47157309-f5a7-4570-8ee3-62edc859f4d8-catalog-content\") pod \"community-operators-86625\" (UID: \"47157309-f5a7-4570-8ee3-62edc859f4d8\") " pod="openshift-marketplace/community-operators-86625" Sep 29 10:45:40 crc kubenswrapper[4891]: I0929 10:45:40.733066 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47157309-f5a7-4570-8ee3-62edc859f4d8-utilities\") pod \"community-operators-86625\" (UID: \"47157309-f5a7-4570-8ee3-62edc859f4d8\") " pod="openshift-marketplace/community-operators-86625" Sep 29 10:45:40 crc kubenswrapper[4891]: I0929 10:45:40.733151 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxz8v\" (UniqueName: 
\"kubernetes.io/projected/47157309-f5a7-4570-8ee3-62edc859f4d8-kube-api-access-vxz8v\") pod \"community-operators-86625\" (UID: \"47157309-f5a7-4570-8ee3-62edc859f4d8\") " pod="openshift-marketplace/community-operators-86625" Sep 29 10:45:40 crc kubenswrapper[4891]: I0929 10:45:40.835523 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47157309-f5a7-4570-8ee3-62edc859f4d8-catalog-content\") pod \"community-operators-86625\" (UID: \"47157309-f5a7-4570-8ee3-62edc859f4d8\") " pod="openshift-marketplace/community-operators-86625" Sep 29 10:45:40 crc kubenswrapper[4891]: I0929 10:45:40.835657 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47157309-f5a7-4570-8ee3-62edc859f4d8-utilities\") pod \"community-operators-86625\" (UID: \"47157309-f5a7-4570-8ee3-62edc859f4d8\") " pod="openshift-marketplace/community-operators-86625" Sep 29 10:45:40 crc kubenswrapper[4891]: I0929 10:45:40.835722 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxz8v\" (UniqueName: \"kubernetes.io/projected/47157309-f5a7-4570-8ee3-62edc859f4d8-kube-api-access-vxz8v\") pod \"community-operators-86625\" (UID: \"47157309-f5a7-4570-8ee3-62edc859f4d8\") " pod="openshift-marketplace/community-operators-86625" Sep 29 10:45:40 crc kubenswrapper[4891]: I0929 10:45:40.836617 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47157309-f5a7-4570-8ee3-62edc859f4d8-catalog-content\") pod \"community-operators-86625\" (UID: \"47157309-f5a7-4570-8ee3-62edc859f4d8\") " pod="openshift-marketplace/community-operators-86625" Sep 29 10:45:40 crc kubenswrapper[4891]: I0929 10:45:40.836981 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/47157309-f5a7-4570-8ee3-62edc859f4d8-utilities\") pod \"community-operators-86625\" (UID: \"47157309-f5a7-4570-8ee3-62edc859f4d8\") " pod="openshift-marketplace/community-operators-86625" Sep 29 10:45:40 crc kubenswrapper[4891]: I0929 10:45:40.855187 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxz8v\" (UniqueName: \"kubernetes.io/projected/47157309-f5a7-4570-8ee3-62edc859f4d8-kube-api-access-vxz8v\") pod \"community-operators-86625\" (UID: \"47157309-f5a7-4570-8ee3-62edc859f4d8\") " pod="openshift-marketplace/community-operators-86625" Sep 29 10:45:40 crc kubenswrapper[4891]: I0929 10:45:40.958058 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-86625" Sep 29 10:45:41 crc kubenswrapper[4891]: I0929 10:45:41.471608 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-86625"] Sep 29 10:45:41 crc kubenswrapper[4891]: I0929 10:45:41.741222 4891 generic.go:334] "Generic (PLEG): container finished" podID="47157309-f5a7-4570-8ee3-62edc859f4d8" containerID="961000c49c710160625508c3e4aa3fd990c6533b6f3ca2e266f516c658662d5c" exitCode=0 Sep 29 10:45:41 crc kubenswrapper[4891]: I0929 10:45:41.741286 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86625" event={"ID":"47157309-f5a7-4570-8ee3-62edc859f4d8","Type":"ContainerDied","Data":"961000c49c710160625508c3e4aa3fd990c6533b6f3ca2e266f516c658662d5c"} Sep 29 10:45:41 crc kubenswrapper[4891]: I0929 10:45:41.741974 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86625" event={"ID":"47157309-f5a7-4570-8ee3-62edc859f4d8","Type":"ContainerStarted","Data":"3089f917fad348f0cb4f57702aea02b6c5c62596a6244ddc662e16766de4f722"} Sep 29 10:45:43 crc kubenswrapper[4891]: I0929 10:45:43.761490 4891 generic.go:334] "Generic (PLEG): container 
finished" podID="47157309-f5a7-4570-8ee3-62edc859f4d8" containerID="d4ce81e95db4da49aa1541f7d47ee31608e9ec3b18472239232e09ff164eb246" exitCode=0 Sep 29 10:45:43 crc kubenswrapper[4891]: I0929 10:45:43.761567 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86625" event={"ID":"47157309-f5a7-4570-8ee3-62edc859f4d8","Type":"ContainerDied","Data":"d4ce81e95db4da49aa1541f7d47ee31608e9ec3b18472239232e09ff164eb246"} Sep 29 10:45:44 crc kubenswrapper[4891]: I0929 10:45:44.774976 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86625" event={"ID":"47157309-f5a7-4570-8ee3-62edc859f4d8","Type":"ContainerStarted","Data":"ed918ba9562e5e1b0a41bd2bc03483b63eaf4f5f317008abb0eda3ff59186f45"} Sep 29 10:45:44 crc kubenswrapper[4891]: I0929 10:45:44.795642 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-86625" podStartSLOduration=2.371178396 podStartE2EDuration="4.795606045s" podCreationTimestamp="2025-09-29 10:45:40 +0000 UTC" firstStartedPulling="2025-09-29 10:45:41.742872991 +0000 UTC m=+3471.948041312" lastFinishedPulling="2025-09-29 10:45:44.16730064 +0000 UTC m=+3474.372468961" observedRunningTime="2025-09-29 10:45:44.794014219 +0000 UTC m=+3474.999182560" watchObservedRunningTime="2025-09-29 10:45:44.795606045 +0000 UTC m=+3475.000774366" Sep 29 10:45:49 crc kubenswrapper[4891]: I0929 10:45:49.202718 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ssd6w/must-gather-67qkg"] Sep 29 10:45:49 crc kubenswrapper[4891]: I0929 10:45:49.205262 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ssd6w/must-gather-67qkg" Sep 29 10:45:49 crc kubenswrapper[4891]: I0929 10:45:49.207550 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ssd6w"/"openshift-service-ca.crt" Sep 29 10:45:49 crc kubenswrapper[4891]: I0929 10:45:49.209098 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-ssd6w"/"default-dockercfg-84rtj" Sep 29 10:45:49 crc kubenswrapper[4891]: I0929 10:45:49.209376 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ssd6w"/"kube-root-ca.crt" Sep 29 10:45:49 crc kubenswrapper[4891]: I0929 10:45:49.221248 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ssd6w/must-gather-67qkg"] Sep 29 10:45:49 crc kubenswrapper[4891]: I0929 10:45:49.287926 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/686db550-749c-4c8f-9a0c-a962fe2b07c9-must-gather-output\") pod \"must-gather-67qkg\" (UID: \"686db550-749c-4c8f-9a0c-a962fe2b07c9\") " pod="openshift-must-gather-ssd6w/must-gather-67qkg" Sep 29 10:45:49 crc kubenswrapper[4891]: I0929 10:45:49.288080 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2fm8\" (UniqueName: \"kubernetes.io/projected/686db550-749c-4c8f-9a0c-a962fe2b07c9-kube-api-access-q2fm8\") pod \"must-gather-67qkg\" (UID: \"686db550-749c-4c8f-9a0c-a962fe2b07c9\") " pod="openshift-must-gather-ssd6w/must-gather-67qkg" Sep 29 10:45:49 crc kubenswrapper[4891]: I0929 10:45:49.389469 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2fm8\" (UniqueName: \"kubernetes.io/projected/686db550-749c-4c8f-9a0c-a962fe2b07c9-kube-api-access-q2fm8\") pod \"must-gather-67qkg\" (UID: \"686db550-749c-4c8f-9a0c-a962fe2b07c9\") " 
pod="openshift-must-gather-ssd6w/must-gather-67qkg" Sep 29 10:45:49 crc kubenswrapper[4891]: I0929 10:45:49.389660 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/686db550-749c-4c8f-9a0c-a962fe2b07c9-must-gather-output\") pod \"must-gather-67qkg\" (UID: \"686db550-749c-4c8f-9a0c-a962fe2b07c9\") " pod="openshift-must-gather-ssd6w/must-gather-67qkg" Sep 29 10:45:49 crc kubenswrapper[4891]: I0929 10:45:49.390318 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/686db550-749c-4c8f-9a0c-a962fe2b07c9-must-gather-output\") pod \"must-gather-67qkg\" (UID: \"686db550-749c-4c8f-9a0c-a962fe2b07c9\") " pod="openshift-must-gather-ssd6w/must-gather-67qkg" Sep 29 10:45:49 crc kubenswrapper[4891]: I0929 10:45:49.411653 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2fm8\" (UniqueName: \"kubernetes.io/projected/686db550-749c-4c8f-9a0c-a962fe2b07c9-kube-api-access-q2fm8\") pod \"must-gather-67qkg\" (UID: \"686db550-749c-4c8f-9a0c-a962fe2b07c9\") " pod="openshift-must-gather-ssd6w/must-gather-67qkg" Sep 29 10:45:49 crc kubenswrapper[4891]: I0929 10:45:49.537356 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ssd6w/must-gather-67qkg" Sep 29 10:45:49 crc kubenswrapper[4891]: I0929 10:45:49.990924 4891 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 10:45:49 crc kubenswrapper[4891]: I0929 10:45:49.994430 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ssd6w/must-gather-67qkg"] Sep 29 10:45:50 crc kubenswrapper[4891]: I0929 10:45:50.874335 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ssd6w/must-gather-67qkg" event={"ID":"686db550-749c-4c8f-9a0c-a962fe2b07c9","Type":"ContainerStarted","Data":"f49183c83502f2ea720968a5755b1fba8d6b0fdfc062e3f0d650d7681122c589"} Sep 29 10:45:50 crc kubenswrapper[4891]: I0929 10:45:50.958442 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-86625" Sep 29 10:45:50 crc kubenswrapper[4891]: I0929 10:45:50.958835 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-86625" Sep 29 10:45:51 crc kubenswrapper[4891]: I0929 10:45:51.021431 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-86625" Sep 29 10:45:51 crc kubenswrapper[4891]: I0929 10:45:51.936219 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-86625" Sep 29 10:45:51 crc kubenswrapper[4891]: I0929 10:45:51.991626 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-86625"] Sep 29 10:45:53 crc kubenswrapper[4891]: I0929 10:45:53.897762 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-86625" podUID="47157309-f5a7-4570-8ee3-62edc859f4d8" containerName="registry-server" 
containerID="cri-o://ed918ba9562e5e1b0a41bd2bc03483b63eaf4f5f317008abb0eda3ff59186f45" gracePeriod=2 Sep 29 10:45:54 crc kubenswrapper[4891]: I0929 10:45:54.436425 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-86625" Sep 29 10:45:54 crc kubenswrapper[4891]: I0929 10:45:54.555459 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47157309-f5a7-4570-8ee3-62edc859f4d8-catalog-content\") pod \"47157309-f5a7-4570-8ee3-62edc859f4d8\" (UID: \"47157309-f5a7-4570-8ee3-62edc859f4d8\") " Sep 29 10:45:54 crc kubenswrapper[4891]: I0929 10:45:54.556045 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47157309-f5a7-4570-8ee3-62edc859f4d8-utilities\") pod \"47157309-f5a7-4570-8ee3-62edc859f4d8\" (UID: \"47157309-f5a7-4570-8ee3-62edc859f4d8\") " Sep 29 10:45:54 crc kubenswrapper[4891]: I0929 10:45:54.556150 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxz8v\" (UniqueName: \"kubernetes.io/projected/47157309-f5a7-4570-8ee3-62edc859f4d8-kube-api-access-vxz8v\") pod \"47157309-f5a7-4570-8ee3-62edc859f4d8\" (UID: \"47157309-f5a7-4570-8ee3-62edc859f4d8\") " Sep 29 10:45:54 crc kubenswrapper[4891]: I0929 10:45:54.558112 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47157309-f5a7-4570-8ee3-62edc859f4d8-utilities" (OuterVolumeSpecName: "utilities") pod "47157309-f5a7-4570-8ee3-62edc859f4d8" (UID: "47157309-f5a7-4570-8ee3-62edc859f4d8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:45:54 crc kubenswrapper[4891]: I0929 10:45:54.565715 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47157309-f5a7-4570-8ee3-62edc859f4d8-kube-api-access-vxz8v" (OuterVolumeSpecName: "kube-api-access-vxz8v") pod "47157309-f5a7-4570-8ee3-62edc859f4d8" (UID: "47157309-f5a7-4570-8ee3-62edc859f4d8"). InnerVolumeSpecName "kube-api-access-vxz8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:45:54 crc kubenswrapper[4891]: I0929 10:45:54.604093 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47157309-f5a7-4570-8ee3-62edc859f4d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47157309-f5a7-4570-8ee3-62edc859f4d8" (UID: "47157309-f5a7-4570-8ee3-62edc859f4d8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:45:54 crc kubenswrapper[4891]: I0929 10:45:54.658941 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47157309-f5a7-4570-8ee3-62edc859f4d8-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:45:54 crc kubenswrapper[4891]: I0929 10:45:54.658983 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxz8v\" (UniqueName: \"kubernetes.io/projected/47157309-f5a7-4570-8ee3-62edc859f4d8-kube-api-access-vxz8v\") on node \"crc\" DevicePath \"\"" Sep 29 10:45:54 crc kubenswrapper[4891]: I0929 10:45:54.658995 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47157309-f5a7-4570-8ee3-62edc859f4d8-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:45:54 crc kubenswrapper[4891]: I0929 10:45:54.913697 4891 generic.go:334] "Generic (PLEG): container finished" podID="47157309-f5a7-4570-8ee3-62edc859f4d8" 
containerID="ed918ba9562e5e1b0a41bd2bc03483b63eaf4f5f317008abb0eda3ff59186f45" exitCode=0 Sep 29 10:45:54 crc kubenswrapper[4891]: I0929 10:45:54.913753 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-86625" Sep 29 10:45:54 crc kubenswrapper[4891]: I0929 10:45:54.913755 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86625" event={"ID":"47157309-f5a7-4570-8ee3-62edc859f4d8","Type":"ContainerDied","Data":"ed918ba9562e5e1b0a41bd2bc03483b63eaf4f5f317008abb0eda3ff59186f45"} Sep 29 10:45:54 crc kubenswrapper[4891]: I0929 10:45:54.913935 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86625" event={"ID":"47157309-f5a7-4570-8ee3-62edc859f4d8","Type":"ContainerDied","Data":"3089f917fad348f0cb4f57702aea02b6c5c62596a6244ddc662e16766de4f722"} Sep 29 10:45:54 crc kubenswrapper[4891]: I0929 10:45:54.913956 4891 scope.go:117] "RemoveContainer" containerID="ed918ba9562e5e1b0a41bd2bc03483b63eaf4f5f317008abb0eda3ff59186f45" Sep 29 10:45:54 crc kubenswrapper[4891]: I0929 10:45:54.919267 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ssd6w/must-gather-67qkg" event={"ID":"686db550-749c-4c8f-9a0c-a962fe2b07c9","Type":"ContainerStarted","Data":"eb9134cec5ab06d54dd532f360e79a441273a4fb8e9e1af98a1b62e72aa1a1ed"} Sep 29 10:45:54 crc kubenswrapper[4891]: I0929 10:45:54.919502 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ssd6w/must-gather-67qkg" event={"ID":"686db550-749c-4c8f-9a0c-a962fe2b07c9","Type":"ContainerStarted","Data":"b928b7d55182a2f06d5db7370f15af1e008fc8789c1b9a27056fce6bb981da34"} Sep 29 10:45:54 crc kubenswrapper[4891]: I0929 10:45:54.953977 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ssd6w/must-gather-67qkg" podStartSLOduration=1.923616784 
podStartE2EDuration="5.953952197s" podCreationTimestamp="2025-09-29 10:45:49 +0000 UTC" firstStartedPulling="2025-09-29 10:45:49.990616545 +0000 UTC m=+3480.195784866" lastFinishedPulling="2025-09-29 10:45:54.020951958 +0000 UTC m=+3484.226120279" observedRunningTime="2025-09-29 10:45:54.939950563 +0000 UTC m=+3485.145118904" watchObservedRunningTime="2025-09-29 10:45:54.953952197 +0000 UTC m=+3485.159120518" Sep 29 10:45:54 crc kubenswrapper[4891]: I0929 10:45:54.957976 4891 scope.go:117] "RemoveContainer" containerID="d4ce81e95db4da49aa1541f7d47ee31608e9ec3b18472239232e09ff164eb246" Sep 29 10:45:54 crc kubenswrapper[4891]: I0929 10:45:54.974419 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-86625"] Sep 29 10:45:54 crc kubenswrapper[4891]: I0929 10:45:54.983351 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-86625"] Sep 29 10:45:54 crc kubenswrapper[4891]: I0929 10:45:54.985117 4891 scope.go:117] "RemoveContainer" containerID="961000c49c710160625508c3e4aa3fd990c6533b6f3ca2e266f516c658662d5c" Sep 29 10:45:55 crc kubenswrapper[4891]: I0929 10:45:55.009704 4891 scope.go:117] "RemoveContainer" containerID="ed918ba9562e5e1b0a41bd2bc03483b63eaf4f5f317008abb0eda3ff59186f45" Sep 29 10:45:55 crc kubenswrapper[4891]: E0929 10:45:55.010205 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed918ba9562e5e1b0a41bd2bc03483b63eaf4f5f317008abb0eda3ff59186f45\": container with ID starting with ed918ba9562e5e1b0a41bd2bc03483b63eaf4f5f317008abb0eda3ff59186f45 not found: ID does not exist" containerID="ed918ba9562e5e1b0a41bd2bc03483b63eaf4f5f317008abb0eda3ff59186f45" Sep 29 10:45:55 crc kubenswrapper[4891]: I0929 10:45:55.010232 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed918ba9562e5e1b0a41bd2bc03483b63eaf4f5f317008abb0eda3ff59186f45"} 
err="failed to get container status \"ed918ba9562e5e1b0a41bd2bc03483b63eaf4f5f317008abb0eda3ff59186f45\": rpc error: code = NotFound desc = could not find container \"ed918ba9562e5e1b0a41bd2bc03483b63eaf4f5f317008abb0eda3ff59186f45\": container with ID starting with ed918ba9562e5e1b0a41bd2bc03483b63eaf4f5f317008abb0eda3ff59186f45 not found: ID does not exist" Sep 29 10:45:55 crc kubenswrapper[4891]: I0929 10:45:55.010257 4891 scope.go:117] "RemoveContainer" containerID="d4ce81e95db4da49aa1541f7d47ee31608e9ec3b18472239232e09ff164eb246" Sep 29 10:45:55 crc kubenswrapper[4891]: E0929 10:45:55.010498 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4ce81e95db4da49aa1541f7d47ee31608e9ec3b18472239232e09ff164eb246\": container with ID starting with d4ce81e95db4da49aa1541f7d47ee31608e9ec3b18472239232e09ff164eb246 not found: ID does not exist" containerID="d4ce81e95db4da49aa1541f7d47ee31608e9ec3b18472239232e09ff164eb246" Sep 29 10:45:55 crc kubenswrapper[4891]: I0929 10:45:55.010514 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4ce81e95db4da49aa1541f7d47ee31608e9ec3b18472239232e09ff164eb246"} err="failed to get container status \"d4ce81e95db4da49aa1541f7d47ee31608e9ec3b18472239232e09ff164eb246\": rpc error: code = NotFound desc = could not find container \"d4ce81e95db4da49aa1541f7d47ee31608e9ec3b18472239232e09ff164eb246\": container with ID starting with d4ce81e95db4da49aa1541f7d47ee31608e9ec3b18472239232e09ff164eb246 not found: ID does not exist" Sep 29 10:45:55 crc kubenswrapper[4891]: I0929 10:45:55.010527 4891 scope.go:117] "RemoveContainer" containerID="961000c49c710160625508c3e4aa3fd990c6533b6f3ca2e266f516c658662d5c" Sep 29 10:45:55 crc kubenswrapper[4891]: E0929 10:45:55.010735 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"961000c49c710160625508c3e4aa3fd990c6533b6f3ca2e266f516c658662d5c\": container with ID starting with 961000c49c710160625508c3e4aa3fd990c6533b6f3ca2e266f516c658662d5c not found: ID does not exist" containerID="961000c49c710160625508c3e4aa3fd990c6533b6f3ca2e266f516c658662d5c" Sep 29 10:45:55 crc kubenswrapper[4891]: I0929 10:45:55.010777 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"961000c49c710160625508c3e4aa3fd990c6533b6f3ca2e266f516c658662d5c"} err="failed to get container status \"961000c49c710160625508c3e4aa3fd990c6533b6f3ca2e266f516c658662d5c\": rpc error: code = NotFound desc = could not find container \"961000c49c710160625508c3e4aa3fd990c6533b6f3ca2e266f516c658662d5c\": container with ID starting with 961000c49c710160625508c3e4aa3fd990c6533b6f3ca2e266f516c658662d5c not found: ID does not exist" Sep 29 10:45:56 crc kubenswrapper[4891]: I0929 10:45:56.407832 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47157309-f5a7-4570-8ee3-62edc859f4d8" path="/var/lib/kubelet/pods/47157309-f5a7-4570-8ee3-62edc859f4d8/volumes" Sep 29 10:45:59 crc kubenswrapper[4891]: I0929 10:45:59.742867 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ssd6w/crc-debug-6w8vk"] Sep 29 10:45:59 crc kubenswrapper[4891]: E0929 10:45:59.743849 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47157309-f5a7-4570-8ee3-62edc859f4d8" containerName="extract-content" Sep 29 10:45:59 crc kubenswrapper[4891]: I0929 10:45:59.743868 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="47157309-f5a7-4570-8ee3-62edc859f4d8" containerName="extract-content" Sep 29 10:45:59 crc kubenswrapper[4891]: E0929 10:45:59.743891 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47157309-f5a7-4570-8ee3-62edc859f4d8" containerName="registry-server" Sep 29 10:45:59 crc kubenswrapper[4891]: I0929 10:45:59.743899 4891 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="47157309-f5a7-4570-8ee3-62edc859f4d8" containerName="registry-server" Sep 29 10:45:59 crc kubenswrapper[4891]: E0929 10:45:59.743933 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47157309-f5a7-4570-8ee3-62edc859f4d8" containerName="extract-utilities" Sep 29 10:45:59 crc kubenswrapper[4891]: I0929 10:45:59.743940 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="47157309-f5a7-4570-8ee3-62edc859f4d8" containerName="extract-utilities" Sep 29 10:45:59 crc kubenswrapper[4891]: I0929 10:45:59.744151 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="47157309-f5a7-4570-8ee3-62edc859f4d8" containerName="registry-server" Sep 29 10:45:59 crc kubenswrapper[4891]: I0929 10:45:59.744862 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ssd6w/crc-debug-6w8vk" Sep 29 10:45:59 crc kubenswrapper[4891]: I0929 10:45:59.856380 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv2sj\" (UniqueName: \"kubernetes.io/projected/21be1b0b-0393-499c-baa1-3d232bfe8d63-kube-api-access-dv2sj\") pod \"crc-debug-6w8vk\" (UID: \"21be1b0b-0393-499c-baa1-3d232bfe8d63\") " pod="openshift-must-gather-ssd6w/crc-debug-6w8vk" Sep 29 10:45:59 crc kubenswrapper[4891]: I0929 10:45:59.856461 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21be1b0b-0393-499c-baa1-3d232bfe8d63-host\") pod \"crc-debug-6w8vk\" (UID: \"21be1b0b-0393-499c-baa1-3d232bfe8d63\") " pod="openshift-must-gather-ssd6w/crc-debug-6w8vk" Sep 29 10:45:59 crc kubenswrapper[4891]: I0929 10:45:59.958705 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv2sj\" (UniqueName: \"kubernetes.io/projected/21be1b0b-0393-499c-baa1-3d232bfe8d63-kube-api-access-dv2sj\") pod \"crc-debug-6w8vk\" (UID: 
\"21be1b0b-0393-499c-baa1-3d232bfe8d63\") " pod="openshift-must-gather-ssd6w/crc-debug-6w8vk" Sep 29 10:45:59 crc kubenswrapper[4891]: I0929 10:45:59.958779 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21be1b0b-0393-499c-baa1-3d232bfe8d63-host\") pod \"crc-debug-6w8vk\" (UID: \"21be1b0b-0393-499c-baa1-3d232bfe8d63\") " pod="openshift-must-gather-ssd6w/crc-debug-6w8vk" Sep 29 10:45:59 crc kubenswrapper[4891]: I0929 10:45:59.958897 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21be1b0b-0393-499c-baa1-3d232bfe8d63-host\") pod \"crc-debug-6w8vk\" (UID: \"21be1b0b-0393-499c-baa1-3d232bfe8d63\") " pod="openshift-must-gather-ssd6w/crc-debug-6w8vk" Sep 29 10:45:59 crc kubenswrapper[4891]: I0929 10:45:59.978958 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv2sj\" (UniqueName: \"kubernetes.io/projected/21be1b0b-0393-499c-baa1-3d232bfe8d63-kube-api-access-dv2sj\") pod \"crc-debug-6w8vk\" (UID: \"21be1b0b-0393-499c-baa1-3d232bfe8d63\") " pod="openshift-must-gather-ssd6w/crc-debug-6w8vk" Sep 29 10:46:00 crc kubenswrapper[4891]: I0929 10:46:00.074505 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ssd6w/crc-debug-6w8vk" Sep 29 10:46:00 crc kubenswrapper[4891]: W0929 10:46:00.111500 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21be1b0b_0393_499c_baa1_3d232bfe8d63.slice/crio-5481d3b4ccd694ab3c39cf51c9471a4b13c595db0cc484c9cd318292d32dd3fe WatchSource:0}: Error finding container 5481d3b4ccd694ab3c39cf51c9471a4b13c595db0cc484c9cd318292d32dd3fe: Status 404 returned error can't find the container with id 5481d3b4ccd694ab3c39cf51c9471a4b13c595db0cc484c9cd318292d32dd3fe Sep 29 10:46:00 crc kubenswrapper[4891]: I0929 10:46:00.993121 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ssd6w/crc-debug-6w8vk" event={"ID":"21be1b0b-0393-499c-baa1-3d232bfe8d63","Type":"ContainerStarted","Data":"5481d3b4ccd694ab3c39cf51c9471a4b13c595db0cc484c9cd318292d32dd3fe"} Sep 29 10:46:06 crc kubenswrapper[4891]: I0929 10:46:06.186193 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:46:06 crc kubenswrapper[4891]: I0929 10:46:06.186759 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:46:06 crc kubenswrapper[4891]: I0929 10:46:06.186822 4891 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 10:46:06 crc kubenswrapper[4891]: I0929 10:46:06.187594 4891 kuberuntime_manager.go:1027] "Message 
for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cdab4385752c540734d6236c2923dc60e973c802c1b2f3d4b59745b776b95511"} pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 10:46:06 crc kubenswrapper[4891]: I0929 10:46:06.187641 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" containerID="cri-o://cdab4385752c540734d6236c2923dc60e973c802c1b2f3d4b59745b776b95511" gracePeriod=600 Sep 29 10:46:07 crc kubenswrapper[4891]: I0929 10:46:07.062701 4891 generic.go:334] "Generic (PLEG): container finished" podID="582de198-5a15-4c4c-aaea-881c638a42ac" containerID="cdab4385752c540734d6236c2923dc60e973c802c1b2f3d4b59745b776b95511" exitCode=0 Sep 29 10:46:07 crc kubenswrapper[4891]: I0929 10:46:07.062752 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerDied","Data":"cdab4385752c540734d6236c2923dc60e973c802c1b2f3d4b59745b776b95511"} Sep 29 10:46:07 crc kubenswrapper[4891]: I0929 10:46:07.062851 4891 scope.go:117] "RemoveContainer" containerID="807968d8f47d2af873cef727ecd734773a1f1568ebaf31b560bc69ffef931329" Sep 29 10:46:12 crc kubenswrapper[4891]: I0929 10:46:12.117554 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerStarted","Data":"a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4"} Sep 29 10:46:12 crc kubenswrapper[4891]: I0929 10:46:12.122449 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ssd6w/crc-debug-6w8vk" 
event={"ID":"21be1b0b-0393-499c-baa1-3d232bfe8d63","Type":"ContainerStarted","Data":"7e549e280d66f2dcb85cc161562e5667b689c98e4c13af30cbe84300b384299a"} Sep 29 10:46:12 crc kubenswrapper[4891]: I0929 10:46:12.154421 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ssd6w/crc-debug-6w8vk" podStartSLOduration=2.308127109 podStartE2EDuration="13.154399599s" podCreationTimestamp="2025-09-29 10:45:59 +0000 UTC" firstStartedPulling="2025-09-29 10:46:00.128413694 +0000 UTC m=+3490.333582015" lastFinishedPulling="2025-09-29 10:46:10.974686184 +0000 UTC m=+3501.179854505" observedRunningTime="2025-09-29 10:46:12.14716188 +0000 UTC m=+3502.352330221" watchObservedRunningTime="2025-09-29 10:46:12.154399599 +0000 UTC m=+3502.359567920" Sep 29 10:47:00 crc kubenswrapper[4891]: I0929 10:47:00.929991 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7ff9945478-9v77b_7efda6e6-9019-4909-96be-068496b2577f/barbican-api/0.log" Sep 29 10:47:00 crc kubenswrapper[4891]: I0929 10:47:00.957737 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7ff9945478-9v77b_7efda6e6-9019-4909-96be-068496b2577f/barbican-api-log/0.log" Sep 29 10:47:01 crc kubenswrapper[4891]: I0929 10:47:01.140005 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-748f5656b6-pdpff_d8d04caa-6db6-41c2-bf9b-f5ed373e9799/barbican-keystone-listener/0.log" Sep 29 10:47:01 crc kubenswrapper[4891]: I0929 10:47:01.167190 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-748f5656b6-pdpff_d8d04caa-6db6-41c2-bf9b-f5ed373e9799/barbican-keystone-listener-log/0.log" Sep 29 10:47:01 crc kubenswrapper[4891]: I0929 10:47:01.325229 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-677dd7cdbc-drpcv_e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99/barbican-worker/0.log" Sep 29 10:47:01 crc 
kubenswrapper[4891]: I0929 10:47:01.383991 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-677dd7cdbc-drpcv_e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99/barbican-worker-log/0.log" Sep 29 10:47:01 crc kubenswrapper[4891]: I0929 10:47:01.619333 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4_9c677d7a-2716-4c8d-8d87-7c158ca5de6c/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:47:01 crc kubenswrapper[4891]: I0929 10:47:01.770510 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7f919b2-f640-44b6-83f9-f870057ba63a/ceilometer-central-agent/0.log" Sep 29 10:47:01 crc kubenswrapper[4891]: I0929 10:47:01.866773 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7f919b2-f640-44b6-83f9-f870057ba63a/ceilometer-notification-agent/0.log" Sep 29 10:47:01 crc kubenswrapper[4891]: I0929 10:47:01.939184 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7f919b2-f640-44b6-83f9-f870057ba63a/proxy-httpd/0.log" Sep 29 10:47:01 crc kubenswrapper[4891]: I0929 10:47:01.963775 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7f919b2-f640-44b6-83f9-f870057ba63a/sg-core/0.log" Sep 29 10:47:02 crc kubenswrapper[4891]: I0929 10:47:02.153758 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_77aa5ca3-797d-4f00-8f2d-d735b77d9965/cinder-api-log/0.log" Sep 29 10:47:02 crc kubenswrapper[4891]: I0929 10:47:02.176018 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_77aa5ca3-797d-4f00-8f2d-d735b77d9965/cinder-api/0.log" Sep 29 10:47:02 crc kubenswrapper[4891]: I0929 10:47:02.415970 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_67d0b646-e147-42d3-8ef9-9001b2b24313/probe/0.log" Sep 29 10:47:02 crc 
kubenswrapper[4891]: I0929 10:47:02.420071 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_67d0b646-e147-42d3-8ef9-9001b2b24313/cinder-scheduler/0.log" Sep 29 10:47:02 crc kubenswrapper[4891]: I0929 10:47:02.610427 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t_677e5a8c-37d1-41a1-bd47-2ef7af3a3570/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:47:02 crc kubenswrapper[4891]: I0929 10:47:02.741345 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl_ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:47:02 crc kubenswrapper[4891]: I0929 10:47:02.855655 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-4jnvs_e33fd450-15eb-4135-bfb6-c42df60defa6/init/0.log" Sep 29 10:47:03 crc kubenswrapper[4891]: I0929 10:47:03.041958 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-4jnvs_e33fd450-15eb-4135-bfb6-c42df60defa6/init/0.log" Sep 29 10:47:03 crc kubenswrapper[4891]: I0929 10:47:03.078914 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-4jnvs_e33fd450-15eb-4135-bfb6-c42df60defa6/dnsmasq-dns/0.log" Sep 29 10:47:03 crc kubenswrapper[4891]: I0929 10:47:03.235191 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr_bd3f0561-9568-4116-b84e-1209c964e50f/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:47:03 crc kubenswrapper[4891]: I0929 10:47:03.265821 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ca02e873-8e2c-4958-a757-92efa57fdea8/glance-httpd/0.log" Sep 29 10:47:03 crc kubenswrapper[4891]: I0929 
10:47:03.447833 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ca02e873-8e2c-4958-a757-92efa57fdea8/glance-log/0.log" Sep 29 10:47:03 crc kubenswrapper[4891]: I0929 10:47:03.547937 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a6e7444d-97cc-440f-92de-e9db5ff440b5/glance-httpd/0.log" Sep 29 10:47:03 crc kubenswrapper[4891]: I0929 10:47:03.641939 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a6e7444d-97cc-440f-92de-e9db5ff440b5/glance-log/0.log" Sep 29 10:47:03 crc kubenswrapper[4891]: I0929 10:47:03.819399 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6d8cd8ff44-d8rc8_d464aff7-6448-4eaf-b88e-01a8acc3e42a/horizon/0.log" Sep 29 10:47:04 crc kubenswrapper[4891]: I0929 10:47:04.067235 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k_81439ac0-9a3d-434f-8122-90cc5eeeba97/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:47:04 crc kubenswrapper[4891]: I0929 10:47:04.230725 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6d8cd8ff44-d8rc8_d464aff7-6448-4eaf-b88e-01a8acc3e42a/horizon-log/0.log" Sep 29 10:47:04 crc kubenswrapper[4891]: I0929 10:47:04.330976 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-6kwtc_b33262be-68ab-40c1-a34e-c629096460a8/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:47:04 crc kubenswrapper[4891]: I0929 10:47:04.580205 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_34c908de-54eb-4e12-a9a9-735fbf07c433/kube-state-metrics/0.log" Sep 29 10:47:04 crc kubenswrapper[4891]: I0929 10:47:04.628257 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-5b9ccf6696-q5jmg_b4050314-008a-4b46-93e7-2d9454fa3d89/keystone-api/0.log" Sep 29 10:47:04 crc kubenswrapper[4891]: I0929 10:47:04.820857 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-l85z5_ed5239b3-e586-4a56-89ce-74977f3509db/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:47:05 crc kubenswrapper[4891]: I0929 10:47:05.148019 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-56d6cd75c7-6j75x_af72b6bb-1073-4ceb-b593-209e646bba5a/neutron-api/0.log" Sep 29 10:47:05 crc kubenswrapper[4891]: I0929 10:47:05.228708 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-56d6cd75c7-6j75x_af72b6bb-1073-4ceb-b593-209e646bba5a/neutron-httpd/0.log" Sep 29 10:47:05 crc kubenswrapper[4891]: I0929 10:47:05.373901 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7_cc623a81-2fe0-42a2-8f61-cb9ab6909984/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:47:05 crc kubenswrapper[4891]: I0929 10:47:05.907578 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ae3bbc79-5ed8-4064-bc90-554ca707171b/nova-api-log/0.log" Sep 29 10:47:06 crc kubenswrapper[4891]: I0929 10:47:06.076080 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ae3bbc79-5ed8-4064-bc90-554ca707171b/nova-api-api/0.log" Sep 29 10:47:06 crc kubenswrapper[4891]: I0929 10:47:06.080667 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_01cf2585-dd79-4154-8567-2c24dee11709/nova-cell0-conductor-conductor/0.log" Sep 29 10:47:06 crc kubenswrapper[4891]: I0929 10:47:06.404718 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_854cde66-3c80-472d-b232-45231eef0bbd/nova-cell1-conductor-conductor/0.log" 
Sep 29 10:47:06 crc kubenswrapper[4891]: I0929 10:47:06.407420 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_2e574b8c-0b11-4d63-a842-239dbbf69258/nova-cell1-novncproxy-novncproxy/0.log" Sep 29 10:47:06 crc kubenswrapper[4891]: I0929 10:47:06.748620 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-nbfms_9257358d-6c0c-43ba-831e-c68505df09d8/nova-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:47:06 crc kubenswrapper[4891]: I0929 10:47:06.842369 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_17f1cdf8-c8a7-42b7-a864-e89db1b08cb7/nova-metadata-log/0.log" Sep 29 10:47:07 crc kubenswrapper[4891]: I0929 10:47:07.291760 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_067392b2-a609-44b9-8796-26df77b11d9e/nova-scheduler-scheduler/0.log" Sep 29 10:47:07 crc kubenswrapper[4891]: I0929 10:47:07.414827 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_71a06463-0c24-4e9a-a4e7-4b0143207f46/mysql-bootstrap/0.log" Sep 29 10:47:07 crc kubenswrapper[4891]: I0929 10:47:07.603541 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_71a06463-0c24-4e9a-a4e7-4b0143207f46/mysql-bootstrap/0.log" Sep 29 10:47:07 crc kubenswrapper[4891]: I0929 10:47:07.641335 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_71a06463-0c24-4e9a-a4e7-4b0143207f46/galera/0.log" Sep 29 10:47:07 crc kubenswrapper[4891]: I0929 10:47:07.884460 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_26add200-3f00-406b-8d30-565e1e51fbd3/mysql-bootstrap/0.log" Sep 29 10:47:08 crc kubenswrapper[4891]: I0929 10:47:08.074728 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_26add200-3f00-406b-8d30-565e1e51fbd3/mysql-bootstrap/0.log" Sep 29 10:47:08 crc kubenswrapper[4891]: I0929 10:47:08.120025 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_26add200-3f00-406b-8d30-565e1e51fbd3/galera/0.log" Sep 29 10:47:08 crc kubenswrapper[4891]: I0929 10:47:08.268892 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_17f1cdf8-c8a7-42b7-a864-e89db1b08cb7/nova-metadata-metadata/0.log" Sep 29 10:47:08 crc kubenswrapper[4891]: I0929 10:47:08.296809 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_d3e0b825-0c6a-49ed-bb87-097ab0e686ee/openstackclient/0.log" Sep 29 10:47:08 crc kubenswrapper[4891]: I0929 10:47:08.537927 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-jq4xk_7484acb7-f4b2-417b-a478-86b8c5999c34/ovn-controller/0.log" Sep 29 10:47:08 crc kubenswrapper[4891]: I0929 10:47:08.756372 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-czcgf_aeab10b4-2f08-4eed-88eb-ba6f26db6cd0/openstack-network-exporter/0.log" Sep 29 10:47:08 crc kubenswrapper[4891]: I0929 10:47:08.790392 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vxc7p_c1fc0a48-e2b3-479b-948c-ff2279a7205c/ovsdb-server-init/0.log" Sep 29 10:47:09 crc kubenswrapper[4891]: I0929 10:47:09.079298 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vxc7p_c1fc0a48-e2b3-479b-948c-ff2279a7205c/ovsdb-server/0.log" Sep 29 10:47:09 crc kubenswrapper[4891]: I0929 10:47:09.106548 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vxc7p_c1fc0a48-e2b3-479b-948c-ff2279a7205c/ovs-vswitchd/0.log" Sep 29 10:47:09 crc kubenswrapper[4891]: I0929 10:47:09.119673 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-vxc7p_c1fc0a48-e2b3-479b-948c-ff2279a7205c/ovsdb-server-init/0.log" Sep 29 10:47:09 crc kubenswrapper[4891]: I0929 10:47:09.335894 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-frbbc_1dc04648-883f-4273-bf36-d550e5caba61/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:47:09 crc kubenswrapper[4891]: I0929 10:47:09.535775 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_942ef260-597a-42db-9123-1e9e0b1c4e1b/openstack-network-exporter/0.log" Sep 29 10:47:09 crc kubenswrapper[4891]: I0929 10:47:09.569633 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_942ef260-597a-42db-9123-1e9e0b1c4e1b/ovn-northd/0.log" Sep 29 10:47:09 crc kubenswrapper[4891]: I0929 10:47:09.769584 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_807cd996-3d20-4f16-b5bb-3b4e4da82775/openstack-network-exporter/0.log" Sep 29 10:47:09 crc kubenswrapper[4891]: I0929 10:47:09.802829 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_807cd996-3d20-4f16-b5bb-3b4e4da82775/ovsdbserver-nb/0.log" Sep 29 10:47:09 crc kubenswrapper[4891]: I0929 10:47:09.995549 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c6033a7a-34ac-409d-ab81-035b291364aa/openstack-network-exporter/0.log" Sep 29 10:47:09 crc kubenswrapper[4891]: I0929 10:47:09.998070 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c6033a7a-34ac-409d-ab81-035b291364aa/ovsdbserver-sb/0.log" Sep 29 10:47:10 crc kubenswrapper[4891]: I0929 10:47:10.284648 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-696f7ffc96-xhjxt_e8ec980b-adab-4378-a632-0de5186250dd/placement-api/0.log" Sep 29 10:47:10 crc kubenswrapper[4891]: I0929 10:47:10.297882 4891 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_placement-696f7ffc96-xhjxt_e8ec980b-adab-4378-a632-0de5186250dd/placement-log/0.log" Sep 29 10:47:10 crc kubenswrapper[4891]: I0929 10:47:10.494627 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cdea1706-076e-4094-a87b-d79580a81fcd/setup-container/0.log" Sep 29 10:47:10 crc kubenswrapper[4891]: I0929 10:47:10.763069 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cdea1706-076e-4094-a87b-d79580a81fcd/setup-container/0.log" Sep 29 10:47:10 crc kubenswrapper[4891]: I0929 10:47:10.764478 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cdea1706-076e-4094-a87b-d79580a81fcd/rabbitmq/0.log" Sep 29 10:47:10 crc kubenswrapper[4891]: I0929 10:47:10.938228 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e1fd90ef-a2cb-4931-ba68-9e302e943c2b/setup-container/0.log" Sep 29 10:47:11 crc kubenswrapper[4891]: I0929 10:47:11.233443 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e1fd90ef-a2cb-4931-ba68-9e302e943c2b/rabbitmq/0.log" Sep 29 10:47:11 crc kubenswrapper[4891]: I0929 10:47:11.240193 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e1fd90ef-a2cb-4931-ba68-9e302e943c2b/setup-container/0.log" Sep 29 10:47:11 crc kubenswrapper[4891]: I0929 10:47:11.463816 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5_cd0ca11c-98f5-4734-bc9a-fef72b1004f8/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:47:11 crc kubenswrapper[4891]: I0929 10:47:11.479468 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-6t8ns_d51c52c6-2e99-465a-9654-c58b12dd213e/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:47:11 crc 
kubenswrapper[4891]: I0929 10:47:11.685466 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d_6398fd57-3f6f-4c01-98da-81f0ad16c4a6/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:47:11 crc kubenswrapper[4891]: I0929 10:47:11.912860 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-2zlpr_92316376-b91d-4e78-ac0c-6f03f1be5f26/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:47:12 crc kubenswrapper[4891]: I0929 10:47:12.030625 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-68vb5_e5a83354-1dda-4488-a048-16ac1b5f36f5/ssh-known-hosts-edpm-deployment/0.log" Sep 29 10:47:12 crc kubenswrapper[4891]: I0929 10:47:12.309770 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7df6f5468c-2kcvk_82a9d505-81c4-410a-9707-adb83f47f425/proxy-server/0.log" Sep 29 10:47:12 crc kubenswrapper[4891]: I0929 10:47:12.379150 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7df6f5468c-2kcvk_82a9d505-81c4-410a-9707-adb83f47f425/proxy-httpd/0.log" Sep 29 10:47:12 crc kubenswrapper[4891]: I0929 10:47:12.552940 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-fnxwg_becd282d-9d1a-4bf8-8e48-cdbab75047e1/swift-ring-rebalance/0.log" Sep 29 10:47:12 crc kubenswrapper[4891]: I0929 10:47:12.628018 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/account-auditor/0.log" Sep 29 10:47:12 crc kubenswrapper[4891]: I0929 10:47:12.761372 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/account-reaper/0.log" Sep 29 10:47:12 crc kubenswrapper[4891]: I0929 10:47:12.967908 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/account-replicator/0.log" Sep 29 10:47:13 crc kubenswrapper[4891]: I0929 10:47:13.018185 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/account-server/0.log" Sep 29 10:47:13 crc kubenswrapper[4891]: I0929 10:47:13.082580 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/container-auditor/0.log" Sep 29 10:47:13 crc kubenswrapper[4891]: I0929 10:47:13.175596 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/container-replicator/0.log" Sep 29 10:47:13 crc kubenswrapper[4891]: I0929 10:47:13.260901 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/container-updater/0.log" Sep 29 10:47:13 crc kubenswrapper[4891]: I0929 10:47:13.293463 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/container-server/0.log" Sep 29 10:47:13 crc kubenswrapper[4891]: I0929 10:47:13.447728 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/object-auditor/0.log" Sep 29 10:47:13 crc kubenswrapper[4891]: I0929 10:47:13.494081 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/object-replicator/0.log" Sep 29 10:47:13 crc kubenswrapper[4891]: I0929 10:47:13.528441 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/object-expirer/0.log" Sep 29 10:47:13 crc kubenswrapper[4891]: I0929 10:47:13.632446 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/object-server/0.log" Sep 29 10:47:13 crc kubenswrapper[4891]: I0929 10:47:13.714479 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/rsync/0.log" Sep 29 10:47:13 crc kubenswrapper[4891]: I0929 10:47:13.716442 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/object-updater/0.log" Sep 29 10:47:13 crc kubenswrapper[4891]: I0929 10:47:13.834851 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/swift-recon-cron/0.log" Sep 29 10:47:14 crc kubenswrapper[4891]: I0929 10:47:14.015755 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg_451c7a1c-dd37-464d-b2c8-7924f1882509/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:47:14 crc kubenswrapper[4891]: I0929 10:47:14.240073 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_ae98e843-bdec-443e-8389-9a58c187f5bd/tempest-tests-tempest-tests-runner/0.log" Sep 29 10:47:14 crc kubenswrapper[4891]: I0929 10:47:14.313688 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_0ae22834-3106-47d5-a04c-0ab9327991df/test-operator-logs-container/0.log" Sep 29 10:47:14 crc kubenswrapper[4891]: I0929 10:47:14.514567 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs_a4406439-b507-4572-b458-58d0ddf2b94d/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:47:23 crc kubenswrapper[4891]: I0929 10:47:23.383978 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_9ec260f8-616d-4e46-8685-0dcabdf10a16/memcached/0.log" Sep 29 10:48:12 crc kubenswrapper[4891]: I0929 10:48:12.255623 4891 generic.go:334] "Generic (PLEG): container finished" podID="21be1b0b-0393-499c-baa1-3d232bfe8d63" containerID="7e549e280d66f2dcb85cc161562e5667b689c98e4c13af30cbe84300b384299a" exitCode=0 Sep 29 10:48:12 crc kubenswrapper[4891]: I0929 10:48:12.255693 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ssd6w/crc-debug-6w8vk" event={"ID":"21be1b0b-0393-499c-baa1-3d232bfe8d63","Type":"ContainerDied","Data":"7e549e280d66f2dcb85cc161562e5667b689c98e4c13af30cbe84300b384299a"} Sep 29 10:48:13 crc kubenswrapper[4891]: I0929 10:48:13.364856 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ssd6w/crc-debug-6w8vk" Sep 29 10:48:13 crc kubenswrapper[4891]: I0929 10:48:13.396280 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ssd6w/crc-debug-6w8vk"] Sep 29 10:48:13 crc kubenswrapper[4891]: I0929 10:48:13.403844 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ssd6w/crc-debug-6w8vk"] Sep 29 10:48:13 crc kubenswrapper[4891]: I0929 10:48:13.492462 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv2sj\" (UniqueName: \"kubernetes.io/projected/21be1b0b-0393-499c-baa1-3d232bfe8d63-kube-api-access-dv2sj\") pod \"21be1b0b-0393-499c-baa1-3d232bfe8d63\" (UID: \"21be1b0b-0393-499c-baa1-3d232bfe8d63\") " Sep 29 10:48:13 crc kubenswrapper[4891]: I0929 10:48:13.492563 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21be1b0b-0393-499c-baa1-3d232bfe8d63-host\") pod \"21be1b0b-0393-499c-baa1-3d232bfe8d63\" (UID: \"21be1b0b-0393-499c-baa1-3d232bfe8d63\") " Sep 29 10:48:13 crc kubenswrapper[4891]: I0929 10:48:13.492746 4891 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21be1b0b-0393-499c-baa1-3d232bfe8d63-host" (OuterVolumeSpecName: "host") pod "21be1b0b-0393-499c-baa1-3d232bfe8d63" (UID: "21be1b0b-0393-499c-baa1-3d232bfe8d63"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:48:13 crc kubenswrapper[4891]: I0929 10:48:13.493831 4891 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21be1b0b-0393-499c-baa1-3d232bfe8d63-host\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:13 crc kubenswrapper[4891]: I0929 10:48:13.498395 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21be1b0b-0393-499c-baa1-3d232bfe8d63-kube-api-access-dv2sj" (OuterVolumeSpecName: "kube-api-access-dv2sj") pod "21be1b0b-0393-499c-baa1-3d232bfe8d63" (UID: "21be1b0b-0393-499c-baa1-3d232bfe8d63"). InnerVolumeSpecName "kube-api-access-dv2sj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:48:13 crc kubenswrapper[4891]: I0929 10:48:13.595993 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv2sj\" (UniqueName: \"kubernetes.io/projected/21be1b0b-0393-499c-baa1-3d232bfe8d63-kube-api-access-dv2sj\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:14 crc kubenswrapper[4891]: I0929 10:48:14.276264 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5481d3b4ccd694ab3c39cf51c9471a4b13c595db0cc484c9cd318292d32dd3fe" Sep 29 10:48:14 crc kubenswrapper[4891]: I0929 10:48:14.276335 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ssd6w/crc-debug-6w8vk" Sep 29 10:48:14 crc kubenswrapper[4891]: I0929 10:48:14.413749 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21be1b0b-0393-499c-baa1-3d232bfe8d63" path="/var/lib/kubelet/pods/21be1b0b-0393-499c-baa1-3d232bfe8d63/volumes" Sep 29 10:48:14 crc kubenswrapper[4891]: I0929 10:48:14.566571 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ssd6w/crc-debug-vks8h"] Sep 29 10:48:14 crc kubenswrapper[4891]: E0929 10:48:14.567014 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21be1b0b-0393-499c-baa1-3d232bfe8d63" containerName="container-00" Sep 29 10:48:14 crc kubenswrapper[4891]: I0929 10:48:14.567029 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="21be1b0b-0393-499c-baa1-3d232bfe8d63" containerName="container-00" Sep 29 10:48:14 crc kubenswrapper[4891]: I0929 10:48:14.567206 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="21be1b0b-0393-499c-baa1-3d232bfe8d63" containerName="container-00" Sep 29 10:48:14 crc kubenswrapper[4891]: I0929 10:48:14.567848 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ssd6w/crc-debug-vks8h" Sep 29 10:48:14 crc kubenswrapper[4891]: I0929 10:48:14.714878 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3a1ee15-524b-46fb-92ad-755723e16afd-host\") pod \"crc-debug-vks8h\" (UID: \"a3a1ee15-524b-46fb-92ad-755723e16afd\") " pod="openshift-must-gather-ssd6w/crc-debug-vks8h" Sep 29 10:48:14 crc kubenswrapper[4891]: I0929 10:48:14.715091 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm8jq\" (UniqueName: \"kubernetes.io/projected/a3a1ee15-524b-46fb-92ad-755723e16afd-kube-api-access-vm8jq\") pod \"crc-debug-vks8h\" (UID: \"a3a1ee15-524b-46fb-92ad-755723e16afd\") " pod="openshift-must-gather-ssd6w/crc-debug-vks8h" Sep 29 10:48:14 crc kubenswrapper[4891]: I0929 10:48:14.817069 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3a1ee15-524b-46fb-92ad-755723e16afd-host\") pod \"crc-debug-vks8h\" (UID: \"a3a1ee15-524b-46fb-92ad-755723e16afd\") " pod="openshift-must-gather-ssd6w/crc-debug-vks8h" Sep 29 10:48:14 crc kubenswrapper[4891]: I0929 10:48:14.817230 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3a1ee15-524b-46fb-92ad-755723e16afd-host\") pod \"crc-debug-vks8h\" (UID: \"a3a1ee15-524b-46fb-92ad-755723e16afd\") " pod="openshift-must-gather-ssd6w/crc-debug-vks8h" Sep 29 10:48:14 crc kubenswrapper[4891]: I0929 10:48:14.817257 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm8jq\" (UniqueName: \"kubernetes.io/projected/a3a1ee15-524b-46fb-92ad-755723e16afd-kube-api-access-vm8jq\") pod \"crc-debug-vks8h\" (UID: \"a3a1ee15-524b-46fb-92ad-755723e16afd\") " pod="openshift-must-gather-ssd6w/crc-debug-vks8h" Sep 29 10:48:14 crc 
kubenswrapper[4891]: I0929 10:48:14.833836 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm8jq\" (UniqueName: \"kubernetes.io/projected/a3a1ee15-524b-46fb-92ad-755723e16afd-kube-api-access-vm8jq\") pod \"crc-debug-vks8h\" (UID: \"a3a1ee15-524b-46fb-92ad-755723e16afd\") " pod="openshift-must-gather-ssd6w/crc-debug-vks8h" Sep 29 10:48:14 crc kubenswrapper[4891]: I0929 10:48:14.892315 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ssd6w/crc-debug-vks8h" Sep 29 10:48:15 crc kubenswrapper[4891]: I0929 10:48:15.287608 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ssd6w/crc-debug-vks8h" event={"ID":"a3a1ee15-524b-46fb-92ad-755723e16afd","Type":"ContainerStarted","Data":"5befeac4ae7041e4c33b93beb0d10eb906e538577fe32b4f025f9d2f72aaffac"} Sep 29 10:48:15 crc kubenswrapper[4891]: I0929 10:48:15.287668 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ssd6w/crc-debug-vks8h" event={"ID":"a3a1ee15-524b-46fb-92ad-755723e16afd","Type":"ContainerStarted","Data":"27d605bcb70c07bf75edd10437d33dd66a8a77d4a6ab522910f1cb33f0159b58"} Sep 29 10:48:15 crc kubenswrapper[4891]: I0929 10:48:15.306154 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ssd6w/crc-debug-vks8h" podStartSLOduration=1.306134514 podStartE2EDuration="1.306134514s" podCreationTimestamp="2025-09-29 10:48:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:48:15.305902407 +0000 UTC m=+3625.511070728" watchObservedRunningTime="2025-09-29 10:48:15.306134514 +0000 UTC m=+3625.511302835" Sep 29 10:48:16 crc kubenswrapper[4891]: I0929 10:48:16.298424 4891 generic.go:334] "Generic (PLEG): container finished" podID="a3a1ee15-524b-46fb-92ad-755723e16afd" 
containerID="5befeac4ae7041e4c33b93beb0d10eb906e538577fe32b4f025f9d2f72aaffac" exitCode=0 Sep 29 10:48:16 crc kubenswrapper[4891]: I0929 10:48:16.298555 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ssd6w/crc-debug-vks8h" event={"ID":"a3a1ee15-524b-46fb-92ad-755723e16afd","Type":"ContainerDied","Data":"5befeac4ae7041e4c33b93beb0d10eb906e538577fe32b4f025f9d2f72aaffac"} Sep 29 10:48:17 crc kubenswrapper[4891]: I0929 10:48:17.398279 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ssd6w/crc-debug-vks8h" Sep 29 10:48:17 crc kubenswrapper[4891]: I0929 10:48:17.572348 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm8jq\" (UniqueName: \"kubernetes.io/projected/a3a1ee15-524b-46fb-92ad-755723e16afd-kube-api-access-vm8jq\") pod \"a3a1ee15-524b-46fb-92ad-755723e16afd\" (UID: \"a3a1ee15-524b-46fb-92ad-755723e16afd\") " Sep 29 10:48:17 crc kubenswrapper[4891]: I0929 10:48:17.572427 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3a1ee15-524b-46fb-92ad-755723e16afd-host\") pod \"a3a1ee15-524b-46fb-92ad-755723e16afd\" (UID: \"a3a1ee15-524b-46fb-92ad-755723e16afd\") " Sep 29 10:48:17 crc kubenswrapper[4891]: I0929 10:48:17.572517 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3a1ee15-524b-46fb-92ad-755723e16afd-host" (OuterVolumeSpecName: "host") pod "a3a1ee15-524b-46fb-92ad-755723e16afd" (UID: "a3a1ee15-524b-46fb-92ad-755723e16afd"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:48:17 crc kubenswrapper[4891]: I0929 10:48:17.574260 4891 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3a1ee15-524b-46fb-92ad-755723e16afd-host\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:17 crc kubenswrapper[4891]: I0929 10:48:17.593371 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3a1ee15-524b-46fb-92ad-755723e16afd-kube-api-access-vm8jq" (OuterVolumeSpecName: "kube-api-access-vm8jq") pod "a3a1ee15-524b-46fb-92ad-755723e16afd" (UID: "a3a1ee15-524b-46fb-92ad-755723e16afd"). InnerVolumeSpecName "kube-api-access-vm8jq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:48:17 crc kubenswrapper[4891]: I0929 10:48:17.675372 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm8jq\" (UniqueName: \"kubernetes.io/projected/a3a1ee15-524b-46fb-92ad-755723e16afd-kube-api-access-vm8jq\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:18 crc kubenswrapper[4891]: I0929 10:48:18.313143 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ssd6w/crc-debug-vks8h" event={"ID":"a3a1ee15-524b-46fb-92ad-755723e16afd","Type":"ContainerDied","Data":"27d605bcb70c07bf75edd10437d33dd66a8a77d4a6ab522910f1cb33f0159b58"} Sep 29 10:48:18 crc kubenswrapper[4891]: I0929 10:48:18.313459 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27d605bcb70c07bf75edd10437d33dd66a8a77d4a6ab522910f1cb33f0159b58" Sep 29 10:48:18 crc kubenswrapper[4891]: I0929 10:48:18.313204 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ssd6w/crc-debug-vks8h" Sep 29 10:48:21 crc kubenswrapper[4891]: I0929 10:48:21.826335 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ssd6w/crc-debug-vks8h"] Sep 29 10:48:21 crc kubenswrapper[4891]: I0929 10:48:21.841024 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ssd6w/crc-debug-vks8h"] Sep 29 10:48:22 crc kubenswrapper[4891]: I0929 10:48:22.408232 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3a1ee15-524b-46fb-92ad-755723e16afd" path="/var/lib/kubelet/pods/a3a1ee15-524b-46fb-92ad-755723e16afd/volumes" Sep 29 10:48:23 crc kubenswrapper[4891]: I0929 10:48:23.002921 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ssd6w/crc-debug-mjtt2"] Sep 29 10:48:23 crc kubenswrapper[4891]: E0929 10:48:23.003590 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3a1ee15-524b-46fb-92ad-755723e16afd" containerName="container-00" Sep 29 10:48:23 crc kubenswrapper[4891]: I0929 10:48:23.003603 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3a1ee15-524b-46fb-92ad-755723e16afd" containerName="container-00" Sep 29 10:48:23 crc kubenswrapper[4891]: I0929 10:48:23.003825 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3a1ee15-524b-46fb-92ad-755723e16afd" containerName="container-00" Sep 29 10:48:23 crc kubenswrapper[4891]: I0929 10:48:23.004416 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ssd6w/crc-debug-mjtt2" Sep 29 10:48:23 crc kubenswrapper[4891]: I0929 10:48:23.168200 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pks6\" (UniqueName: \"kubernetes.io/projected/30320c5e-3ad3-476f-ae27-50d306a5a61d-kube-api-access-4pks6\") pod \"crc-debug-mjtt2\" (UID: \"30320c5e-3ad3-476f-ae27-50d306a5a61d\") " pod="openshift-must-gather-ssd6w/crc-debug-mjtt2" Sep 29 10:48:23 crc kubenswrapper[4891]: I0929 10:48:23.168299 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30320c5e-3ad3-476f-ae27-50d306a5a61d-host\") pod \"crc-debug-mjtt2\" (UID: \"30320c5e-3ad3-476f-ae27-50d306a5a61d\") " pod="openshift-must-gather-ssd6w/crc-debug-mjtt2" Sep 29 10:48:23 crc kubenswrapper[4891]: I0929 10:48:23.270570 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pks6\" (UniqueName: \"kubernetes.io/projected/30320c5e-3ad3-476f-ae27-50d306a5a61d-kube-api-access-4pks6\") pod \"crc-debug-mjtt2\" (UID: \"30320c5e-3ad3-476f-ae27-50d306a5a61d\") " pod="openshift-must-gather-ssd6w/crc-debug-mjtt2" Sep 29 10:48:23 crc kubenswrapper[4891]: I0929 10:48:23.270951 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30320c5e-3ad3-476f-ae27-50d306a5a61d-host\") pod \"crc-debug-mjtt2\" (UID: \"30320c5e-3ad3-476f-ae27-50d306a5a61d\") " pod="openshift-must-gather-ssd6w/crc-debug-mjtt2" Sep 29 10:48:23 crc kubenswrapper[4891]: I0929 10:48:23.271036 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30320c5e-3ad3-476f-ae27-50d306a5a61d-host\") pod \"crc-debug-mjtt2\" (UID: \"30320c5e-3ad3-476f-ae27-50d306a5a61d\") " pod="openshift-must-gather-ssd6w/crc-debug-mjtt2" Sep 29 10:48:23 crc 
kubenswrapper[4891]: I0929 10:48:23.300696 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pks6\" (UniqueName: \"kubernetes.io/projected/30320c5e-3ad3-476f-ae27-50d306a5a61d-kube-api-access-4pks6\") pod \"crc-debug-mjtt2\" (UID: \"30320c5e-3ad3-476f-ae27-50d306a5a61d\") " pod="openshift-must-gather-ssd6w/crc-debug-mjtt2" Sep 29 10:48:23 crc kubenswrapper[4891]: I0929 10:48:23.323820 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ssd6w/crc-debug-mjtt2" Sep 29 10:48:23 crc kubenswrapper[4891]: W0929 10:48:23.375097 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30320c5e_3ad3_476f_ae27_50d306a5a61d.slice/crio-bb5b6c5b4f4970bfae83ceee89a5f89fa1f62e479fc05244f9812890f85800a5 WatchSource:0}: Error finding container bb5b6c5b4f4970bfae83ceee89a5f89fa1f62e479fc05244f9812890f85800a5: Status 404 returned error can't find the container with id bb5b6c5b4f4970bfae83ceee89a5f89fa1f62e479fc05244f9812890f85800a5 Sep 29 10:48:24 crc kubenswrapper[4891]: I0929 10:48:24.384166 4891 generic.go:334] "Generic (PLEG): container finished" podID="30320c5e-3ad3-476f-ae27-50d306a5a61d" containerID="7e3a8ef30923ef454c897c80db116fe7e0a209fe6bd26224771825cdf3a57c25" exitCode=0 Sep 29 10:48:24 crc kubenswrapper[4891]: I0929 10:48:24.384288 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ssd6w/crc-debug-mjtt2" event={"ID":"30320c5e-3ad3-476f-ae27-50d306a5a61d","Type":"ContainerDied","Data":"7e3a8ef30923ef454c897c80db116fe7e0a209fe6bd26224771825cdf3a57c25"} Sep 29 10:48:24 crc kubenswrapper[4891]: I0929 10:48:24.384921 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ssd6w/crc-debug-mjtt2" event={"ID":"30320c5e-3ad3-476f-ae27-50d306a5a61d","Type":"ContainerStarted","Data":"bb5b6c5b4f4970bfae83ceee89a5f89fa1f62e479fc05244f9812890f85800a5"} Sep 29 
10:48:24 crc kubenswrapper[4891]: I0929 10:48:24.434910 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ssd6w/crc-debug-mjtt2"] Sep 29 10:48:24 crc kubenswrapper[4891]: I0929 10:48:24.444639 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ssd6w/crc-debug-mjtt2"] Sep 29 10:48:25 crc kubenswrapper[4891]: I0929 10:48:25.496082 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ssd6w/crc-debug-mjtt2" Sep 29 10:48:25 crc kubenswrapper[4891]: I0929 10:48:25.615425 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30320c5e-3ad3-476f-ae27-50d306a5a61d-host\") pod \"30320c5e-3ad3-476f-ae27-50d306a5a61d\" (UID: \"30320c5e-3ad3-476f-ae27-50d306a5a61d\") " Sep 29 10:48:25 crc kubenswrapper[4891]: I0929 10:48:25.615510 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/30320c5e-3ad3-476f-ae27-50d306a5a61d-host" (OuterVolumeSpecName: "host") pod "30320c5e-3ad3-476f-ae27-50d306a5a61d" (UID: "30320c5e-3ad3-476f-ae27-50d306a5a61d"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:48:25 crc kubenswrapper[4891]: I0929 10:48:25.615620 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pks6\" (UniqueName: \"kubernetes.io/projected/30320c5e-3ad3-476f-ae27-50d306a5a61d-kube-api-access-4pks6\") pod \"30320c5e-3ad3-476f-ae27-50d306a5a61d\" (UID: \"30320c5e-3ad3-476f-ae27-50d306a5a61d\") " Sep 29 10:48:25 crc kubenswrapper[4891]: I0929 10:48:25.616131 4891 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30320c5e-3ad3-476f-ae27-50d306a5a61d-host\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:25 crc kubenswrapper[4891]: I0929 10:48:25.621568 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30320c5e-3ad3-476f-ae27-50d306a5a61d-kube-api-access-4pks6" (OuterVolumeSpecName: "kube-api-access-4pks6") pod "30320c5e-3ad3-476f-ae27-50d306a5a61d" (UID: "30320c5e-3ad3-476f-ae27-50d306a5a61d"). InnerVolumeSpecName "kube-api-access-4pks6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:48:25 crc kubenswrapper[4891]: I0929 10:48:25.717778 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pks6\" (UniqueName: \"kubernetes.io/projected/30320c5e-3ad3-476f-ae27-50d306a5a61d-kube-api-access-4pks6\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:26 crc kubenswrapper[4891]: I0929 10:48:26.167007 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj_47086f2a-3e89-4170-9f19-5bfd1d07c1ff/util/0.log" Sep 29 10:48:26 crc kubenswrapper[4891]: I0929 10:48:26.386710 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj_47086f2a-3e89-4170-9f19-5bfd1d07c1ff/util/0.log" Sep 29 10:48:26 crc kubenswrapper[4891]: I0929 10:48:26.403598 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ssd6w/crc-debug-mjtt2" Sep 29 10:48:26 crc kubenswrapper[4891]: I0929 10:48:26.408563 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30320c5e-3ad3-476f-ae27-50d306a5a61d" path="/var/lib/kubelet/pods/30320c5e-3ad3-476f-ae27-50d306a5a61d/volumes" Sep 29 10:48:26 crc kubenswrapper[4891]: I0929 10:48:26.409541 4891 scope.go:117] "RemoveContainer" containerID="7e3a8ef30923ef454c897c80db116fe7e0a209fe6bd26224771825cdf3a57c25" Sep 29 10:48:26 crc kubenswrapper[4891]: I0929 10:48:26.443963 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj_47086f2a-3e89-4170-9f19-5bfd1d07c1ff/pull/0.log" Sep 29 10:48:26 crc kubenswrapper[4891]: I0929 10:48:26.470345 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj_47086f2a-3e89-4170-9f19-5bfd1d07c1ff/pull/0.log" Sep 29 10:48:26 crc kubenswrapper[4891]: I0929 10:48:26.655461 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj_47086f2a-3e89-4170-9f19-5bfd1d07c1ff/pull/0.log" Sep 29 10:48:26 crc kubenswrapper[4891]: I0929 10:48:26.709322 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj_47086f2a-3e89-4170-9f19-5bfd1d07c1ff/util/0.log" Sep 29 10:48:26 crc kubenswrapper[4891]: I0929 10:48:26.710315 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj_47086f2a-3e89-4170-9f19-5bfd1d07c1ff/extract/0.log" Sep 29 10:48:26 crc kubenswrapper[4891]: I0929 10:48:26.887609 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6495d75b5-w2lkh_28d145a8-69b6-4cf0-be6b-8bfbd0d2df07/kube-rbac-proxy/0.log" Sep 29 10:48:26 crc kubenswrapper[4891]: I0929 10:48:26.957265 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6495d75b5-w2lkh_28d145a8-69b6-4cf0-be6b-8bfbd0d2df07/manager/0.log" Sep 29 10:48:26 crc kubenswrapper[4891]: I0929 10:48:26.957434 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-748c574d75-qmk9v_a7ad802e-1b9c-4ab0-a7eb-82932b6f5090/kube-rbac-proxy/0.log" Sep 29 10:48:27 crc kubenswrapper[4891]: I0929 10:48:27.102754 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-748c574d75-qmk9v_a7ad802e-1b9c-4ab0-a7eb-82932b6f5090/manager/0.log" Sep 29 10:48:27 crc kubenswrapper[4891]: 
I0929 10:48:27.156693 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d74f4d695-m5dl2_a04ca278-c2e3-4b48-85f8-16972204c367/kube-rbac-proxy/0.log" Sep 29 10:48:27 crc kubenswrapper[4891]: I0929 10:48:27.231556 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d74f4d695-m5dl2_a04ca278-c2e3-4b48-85f8-16972204c367/manager/0.log" Sep 29 10:48:27 crc kubenswrapper[4891]: I0929 10:48:27.332375 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-67b5d44b7f-r4w8j_40dffb60-1139-4864-b251-0aa8c145b66e/kube-rbac-proxy/0.log" Sep 29 10:48:27 crc kubenswrapper[4891]: I0929 10:48:27.428586 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-67b5d44b7f-r4w8j_40dffb60-1139-4864-b251-0aa8c145b66e/manager/0.log" Sep 29 10:48:27 crc kubenswrapper[4891]: I0929 10:48:27.545559 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8ff95898-x2qwd_b1ce187f-22c9-47c7-9f8b-e8d4b6c2aa31/manager/0.log" Sep 29 10:48:27 crc kubenswrapper[4891]: I0929 10:48:27.572324 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8ff95898-x2qwd_b1ce187f-22c9-47c7-9f8b-e8d4b6c2aa31/kube-rbac-proxy/0.log" Sep 29 10:48:27 crc kubenswrapper[4891]: I0929 10:48:27.709410 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-695847bc78-lwsw4_543e23f1-51b6-489d-91d8-b1550bb69680/kube-rbac-proxy/0.log" Sep 29 10:48:27 crc kubenswrapper[4891]: I0929 10:48:27.755277 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-695847bc78-lwsw4_543e23f1-51b6-489d-91d8-b1550bb69680/manager/0.log" 
Sep 29 10:48:27 crc kubenswrapper[4891]: I0929 10:48:27.871073 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-858cd69f49-zj5dm_75843062-7193-4953-add3-5859f3dce7de/kube-rbac-proxy/0.log" Sep 29 10:48:28 crc kubenswrapper[4891]: I0929 10:48:28.012129 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-858cd69f49-zj5dm_75843062-7193-4953-add3-5859f3dce7de/manager/0.log" Sep 29 10:48:28 crc kubenswrapper[4891]: I0929 10:48:28.039707 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9fc8d5567-8xchz_bddd647a-c213-41dd-9f22-3cef16c4622b/kube-rbac-proxy/0.log" Sep 29 10:48:28 crc kubenswrapper[4891]: I0929 10:48:28.102199 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9fc8d5567-8xchz_bddd647a-c213-41dd-9f22-3cef16c4622b/manager/0.log" Sep 29 10:48:28 crc kubenswrapper[4891]: I0929 10:48:28.225533 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7bf498966c-2nk4k_6467aac8-0edf-44db-b402-518abc31f6a1/kube-rbac-proxy/0.log" Sep 29 10:48:28 crc kubenswrapper[4891]: I0929 10:48:28.288434 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7bf498966c-2nk4k_6467aac8-0edf-44db-b402-518abc31f6a1/manager/0.log" Sep 29 10:48:28 crc kubenswrapper[4891]: I0929 10:48:28.419024 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-56cf9c6b99-6jllh_9f51bd90-5b61-4cec-875e-d515cc501a22/kube-rbac-proxy/0.log" Sep 29 10:48:28 crc kubenswrapper[4891]: I0929 10:48:28.450747 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_manila-operator-controller-manager-56cf9c6b99-6jllh_9f51bd90-5b61-4cec-875e-d515cc501a22/manager/0.log" Sep 29 10:48:28 crc kubenswrapper[4891]: I0929 10:48:28.480664 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-687b9cf756-gxs5h_67ca192a-9f26-47d4-b299-35b0522e9e53/kube-rbac-proxy/0.log" Sep 29 10:48:28 crc kubenswrapper[4891]: I0929 10:48:28.614629 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-687b9cf756-gxs5h_67ca192a-9f26-47d4-b299-35b0522e9e53/manager/0.log" Sep 29 10:48:28 crc kubenswrapper[4891]: I0929 10:48:28.671151 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54d766c9f9-f7t8j_177d1c2e-3396-4516-aed4-31227f05abff/kube-rbac-proxy/0.log" Sep 29 10:48:28 crc kubenswrapper[4891]: I0929 10:48:28.713711 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54d766c9f9-f7t8j_177d1c2e-3396-4516-aed4-31227f05abff/manager/0.log" Sep 29 10:48:28 crc kubenswrapper[4891]: I0929 10:48:28.846993 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-hk7zb_910f1b22-b26a-4e74-b716-89b912927374/kube-rbac-proxy/0.log" Sep 29 10:48:28 crc kubenswrapper[4891]: I0929 10:48:28.957994 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-hk7zb_910f1b22-b26a-4e74-b716-89b912927374/manager/0.log" Sep 29 10:48:29 crc kubenswrapper[4891]: I0929 10:48:29.210722 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-t4j4f_348984e7-163d-4396-84f5-319eb4fc79fb/kube-rbac-proxy/0.log" Sep 29 10:48:29 crc kubenswrapper[4891]: I0929 10:48:29.308927 4891 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-t4j4f_348984e7-163d-4396-84f5-319eb4fc79fb/manager/0.log" Sep 29 10:48:29 crc kubenswrapper[4891]: I0929 10:48:29.309407 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-f5bh2_51a34f9a-d71a-45d0-9a76-01d629fc7d79/kube-rbac-proxy/0.log" Sep 29 10:48:29 crc kubenswrapper[4891]: I0929 10:48:29.388740 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-f5bh2_51a34f9a-d71a-45d0-9a76-01d629fc7d79/manager/0.log" Sep 29 10:48:29 crc kubenswrapper[4891]: I0929 10:48:29.521662 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6475d4f6d5-ckgrz_26216b37-e307-4ecb-ade6-2402d26f32d9/kube-rbac-proxy/0.log" Sep 29 10:48:29 crc kubenswrapper[4891]: I0929 10:48:29.703961 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-844b5d775b-wwwqn_01b68a39-0d11-4f7f-b7c8-b2e50dae7e2d/kube-rbac-proxy/0.log" Sep 29 10:48:29 crc kubenswrapper[4891]: I0929 10:48:29.847329 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-844b5d775b-wwwqn_01b68a39-0d11-4f7f-b7c8-b2e50dae7e2d/operator/0.log" Sep 29 10:48:29 crc kubenswrapper[4891]: I0929 10:48:29.856548 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-qvghn_dffc134e-8cef-47fa-a97b-08b58fee948c/registry-server/0.log" Sep 29 10:48:30 crc kubenswrapper[4891]: I0929 10:48:30.011548 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5f95c46c78-d45q4_293261f0-9425-4e31-a66d-d8ad8a913228/kube-rbac-proxy/0.log" Sep 29 
10:48:30 crc kubenswrapper[4891]: I0929 10:48:30.180054 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-774b97b48-xlml4_950148f3-aa8c-45bd-9922-6c4e2683d004/kube-rbac-proxy/0.log" Sep 29 10:48:30 crc kubenswrapper[4891]: I0929 10:48:30.182006 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5f95c46c78-d45q4_293261f0-9425-4e31-a66d-d8ad8a913228/manager/0.log" Sep 29 10:48:30 crc kubenswrapper[4891]: I0929 10:48:30.275498 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-774b97b48-xlml4_950148f3-aa8c-45bd-9922-6c4e2683d004/manager/0.log" Sep 29 10:48:30 crc kubenswrapper[4891]: I0929 10:48:30.482537 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-8b4x9_31d92d3d-3a46-416c-b5f0-6fb12bb5bead/operator/0.log" Sep 29 10:48:30 crc kubenswrapper[4891]: I0929 10:48:30.523919 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-9xmvx_48c30870-804a-4f13-95f4-ec4a5a02b536/kube-rbac-proxy/0.log" Sep 29 10:48:30 crc kubenswrapper[4891]: I0929 10:48:30.560920 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6475d4f6d5-ckgrz_26216b37-e307-4ecb-ade6-2402d26f32d9/manager/0.log" Sep 29 10:48:30 crc kubenswrapper[4891]: I0929 10:48:30.613303 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-9xmvx_48c30870-804a-4f13-95f4-ec4a5a02b536/manager/0.log" Sep 29 10:48:30 crc kubenswrapper[4891]: I0929 10:48:30.675645 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5bf96cfbc4-8kqf5_539a685d-4cdf-4344-a7a3-448ec5e9ba6e/kube-rbac-proxy/0.log" Sep 29 10:48:30 crc kubenswrapper[4891]: I0929 10:48:30.775325 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5bf96cfbc4-8kqf5_539a685d-4cdf-4344-a7a3-448ec5e9ba6e/manager/0.log" Sep 29 10:48:30 crc kubenswrapper[4891]: I0929 10:48:30.821839 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-r2lgd_59104851-7ccd-446a-9441-ef993caefd10/kube-rbac-proxy/0.log" Sep 29 10:48:30 crc kubenswrapper[4891]: I0929 10:48:30.877753 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-r2lgd_59104851-7ccd-446a-9441-ef993caefd10/manager/0.log" Sep 29 10:48:31 crc kubenswrapper[4891]: I0929 10:48:31.010256 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76669f99c-q67kx_1ab4abbc-82b1-4624-856b-cbd9062184c0/kube-rbac-proxy/0.log" Sep 29 10:48:31 crc kubenswrapper[4891]: I0929 10:48:31.019022 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76669f99c-q67kx_1ab4abbc-82b1-4624-856b-cbd9062184c0/manager/0.log" Sep 29 10:48:36 crc kubenswrapper[4891]: I0929 10:48:36.186162 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:48:36 crc kubenswrapper[4891]: I0929 10:48:36.187640 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" 
podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:48:46 crc kubenswrapper[4891]: I0929 10:48:46.604284 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-bspfg_570d72c8-d4ed-4b0a-876a-5a942b32a958/control-plane-machine-set-operator/0.log" Sep 29 10:48:46 crc kubenswrapper[4891]: I0929 10:48:46.776987 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4hjlz_fa68a099-1736-4f9a-bcaf-9840257afaeb/kube-rbac-proxy/0.log" Sep 29 10:48:46 crc kubenswrapper[4891]: I0929 10:48:46.818925 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4hjlz_fa68a099-1736-4f9a-bcaf-9840257afaeb/machine-api-operator/0.log" Sep 29 10:48:58 crc kubenswrapper[4891]: I0929 10:48:58.205994 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-xxtv8_10df5ae8-eb89-4efd-8877-6a87a962fbe7/cert-manager-controller/0.log" Sep 29 10:48:58 crc kubenswrapper[4891]: I0929 10:48:58.363546 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-dtxr6_32062242-b85f-4c38-a6dd-5701216a7a26/cert-manager-cainjector/0.log" Sep 29 10:48:58 crc kubenswrapper[4891]: I0929 10:48:58.409779 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-sf94x_b1de57da-fef3-4c24-a501-7f14e9973be9/cert-manager-webhook/0.log" Sep 29 10:49:06 crc kubenswrapper[4891]: I0929 10:49:06.185745 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:49:06 crc kubenswrapper[4891]: I0929 10:49:06.186342 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:49:09 crc kubenswrapper[4891]: I0929 10:49:09.878647 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-dfnnl_33746ad6-4439-446b-bea7-2797ca5a9c37/nmstate-console-plugin/0.log" Sep 29 10:49:10 crc kubenswrapper[4891]: I0929 10:49:10.023679 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-hkgzc_393d2298-0458-4346-bfe0-d492fb362511/nmstate-handler/0.log" Sep 29 10:49:10 crc kubenswrapper[4891]: I0929 10:49:10.076401 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-7mggl_45398cc7-ef38-4555-befb-ac59051493ed/kube-rbac-proxy/0.log" Sep 29 10:49:10 crc kubenswrapper[4891]: I0929 10:49:10.079483 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-7mggl_45398cc7-ef38-4555-befb-ac59051493ed/nmstate-metrics/0.log" Sep 29 10:49:10 crc kubenswrapper[4891]: I0929 10:49:10.220458 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-qns5g_08922d5d-a7ec-41c0-9085-bcc17847df78/nmstate-operator/0.log" Sep 29 10:49:10 crc kubenswrapper[4891]: I0929 10:49:10.300014 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-h5mfq_0cde15e7-98b3-44c6-9d10-927909f5f269/nmstate-webhook/0.log" Sep 29 10:49:23 crc kubenswrapper[4891]: I0929 10:49:23.814100 4891 log.go:25] "Finished parsing log 
file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-2q7w2_105a82c3-b488-41fb-a511-69b3c239dbd2/kube-rbac-proxy/0.log" Sep 29 10:49:23 crc kubenswrapper[4891]: I0929 10:49:23.951528 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-2q7w2_105a82c3-b488-41fb-a511-69b3c239dbd2/controller/0.log" Sep 29 10:49:24 crc kubenswrapper[4891]: I0929 10:49:24.014750 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/cp-frr-files/0.log" Sep 29 10:49:24 crc kubenswrapper[4891]: I0929 10:49:24.164695 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/cp-reloader/0.log" Sep 29 10:49:24 crc kubenswrapper[4891]: I0929 10:49:24.173747 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/cp-metrics/0.log" Sep 29 10:49:24 crc kubenswrapper[4891]: I0929 10:49:24.190442 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/cp-reloader/0.log" Sep 29 10:49:24 crc kubenswrapper[4891]: I0929 10:49:24.195216 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/cp-frr-files/0.log" Sep 29 10:49:24 crc kubenswrapper[4891]: I0929 10:49:24.391765 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/cp-reloader/0.log" Sep 29 10:49:24 crc kubenswrapper[4891]: I0929 10:49:24.403870 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/cp-metrics/0.log" Sep 29 10:49:24 crc kubenswrapper[4891]: I0929 10:49:24.409219 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/cp-frr-files/0.log" Sep 29 10:49:24 crc kubenswrapper[4891]: I0929 10:49:24.409405 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/cp-metrics/0.log" Sep 29 10:49:24 crc kubenswrapper[4891]: I0929 10:49:24.567911 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/cp-frr-files/0.log" Sep 29 10:49:24 crc kubenswrapper[4891]: I0929 10:49:24.582639 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/controller/0.log" Sep 29 10:49:24 crc kubenswrapper[4891]: I0929 10:49:24.599367 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/cp-metrics/0.log" Sep 29 10:49:24 crc kubenswrapper[4891]: I0929 10:49:24.605691 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/cp-reloader/0.log" Sep 29 10:49:24 crc kubenswrapper[4891]: I0929 10:49:24.799624 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/kube-rbac-proxy/0.log" Sep 29 10:49:24 crc kubenswrapper[4891]: I0929 10:49:24.805581 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/frr-metrics/0.log" Sep 29 10:49:24 crc kubenswrapper[4891]: I0929 10:49:24.822956 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/kube-rbac-proxy-frr/0.log" Sep 29 10:49:25 crc kubenswrapper[4891]: I0929 10:49:25.051525 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/reloader/0.log" Sep 29 10:49:25 crc kubenswrapper[4891]: I0929 10:49:25.052192 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-bb72s_88e3267b-49e6-443d-8cc6-285a983b44ec/frr-k8s-webhook-server/0.log" Sep 29 10:49:25 crc kubenswrapper[4891]: I0929 10:49:25.334735 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-8dcfb8c5d-mbwfg_8a6839a2-c048-442c-a761-c6c1adec39a2/manager/0.log" Sep 29 10:49:25 crc kubenswrapper[4891]: I0929 10:49:25.479435 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-679f568586-f4xqz_4ef2ef1c-41d9-4878-9d58-f04f8dc07b2f/webhook-server/0.log" Sep 29 10:49:25 crc kubenswrapper[4891]: I0929 10:49:25.553760 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wlhcc_eec47c49-2fdd-4eba-aca2-438041840948/kube-rbac-proxy/0.log" Sep 29 10:49:26 crc kubenswrapper[4891]: I0929 10:49:26.095970 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wlhcc_eec47c49-2fdd-4eba-aca2-438041840948/speaker/0.log" Sep 29 10:49:26 crc kubenswrapper[4891]: I0929 10:49:26.296367 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/frr/0.log" Sep 29 10:49:36 crc kubenswrapper[4891]: I0929 10:49:36.186236 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:49:36 crc kubenswrapper[4891]: I0929 10:49:36.186859 4891 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:49:36 crc kubenswrapper[4891]: I0929 10:49:36.186989 4891 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 10:49:36 crc kubenswrapper[4891]: I0929 10:49:36.188268 4891 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4"} pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 10:49:36 crc kubenswrapper[4891]: I0929 10:49:36.188391 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" containerID="cri-o://a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" gracePeriod=600 Sep 29 10:49:36 crc kubenswrapper[4891]: E0929 10:49:36.346869 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:49:37 crc kubenswrapper[4891]: I0929 10:49:37.047356 4891 generic.go:334] "Generic (PLEG): container finished" podID="582de198-5a15-4c4c-aaea-881c638a42ac" 
containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" exitCode=0 Sep 29 10:49:37 crc kubenswrapper[4891]: I0929 10:49:37.047406 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerDied","Data":"a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4"} Sep 29 10:49:37 crc kubenswrapper[4891]: I0929 10:49:37.047449 4891 scope.go:117] "RemoveContainer" containerID="cdab4385752c540734d6236c2923dc60e973c802c1b2f3d4b59745b776b95511" Sep 29 10:49:37 crc kubenswrapper[4891]: I0929 10:49:37.048154 4891 scope.go:117] "RemoveContainer" containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 10:49:37 crc kubenswrapper[4891]: E0929 10:49:37.048588 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:49:37 crc kubenswrapper[4891]: I0929 10:49:37.569137 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8_011f8a2f-1062-4b33-8244-235930966cf1/util/0.log" Sep 29 10:49:37 crc kubenswrapper[4891]: I0929 10:49:37.695666 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8_011f8a2f-1062-4b33-8244-235930966cf1/util/0.log" Sep 29 10:49:37 crc kubenswrapper[4891]: I0929 10:49:37.696123 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8_011f8a2f-1062-4b33-8244-235930966cf1/pull/0.log" Sep 29 10:49:37 crc kubenswrapper[4891]: I0929 10:49:37.769370 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8_011f8a2f-1062-4b33-8244-235930966cf1/pull/0.log" Sep 29 10:49:37 crc kubenswrapper[4891]: I0929 10:49:37.914578 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8_011f8a2f-1062-4b33-8244-235930966cf1/util/0.log" Sep 29 10:49:37 crc kubenswrapper[4891]: I0929 10:49:37.927458 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8_011f8a2f-1062-4b33-8244-235930966cf1/pull/0.log" Sep 29 10:49:37 crc kubenswrapper[4891]: I0929 10:49:37.974238 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8_011f8a2f-1062-4b33-8244-235930966cf1/extract/0.log" Sep 29 10:49:38 crc kubenswrapper[4891]: I0929 10:49:38.104855 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xxqvf_c751fcd1-3522-4572-a3f1-52acfab7c45d/extract-utilities/0.log" Sep 29 10:49:38 crc kubenswrapper[4891]: I0929 10:49:38.245926 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xxqvf_c751fcd1-3522-4572-a3f1-52acfab7c45d/extract-content/0.log" Sep 29 10:49:38 crc kubenswrapper[4891]: I0929 10:49:38.254468 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xxqvf_c751fcd1-3522-4572-a3f1-52acfab7c45d/extract-utilities/0.log" Sep 29 10:49:38 crc kubenswrapper[4891]: I0929 10:49:38.277989 4891 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xxqvf_c751fcd1-3522-4572-a3f1-52acfab7c45d/extract-content/0.log" Sep 29 10:49:38 crc kubenswrapper[4891]: I0929 10:49:38.434235 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xxqvf_c751fcd1-3522-4572-a3f1-52acfab7c45d/extract-utilities/0.log" Sep 29 10:49:38 crc kubenswrapper[4891]: I0929 10:49:38.434281 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xxqvf_c751fcd1-3522-4572-a3f1-52acfab7c45d/extract-content/0.log" Sep 29 10:49:38 crc kubenswrapper[4891]: I0929 10:49:38.675353 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5txtz_33c8f323-70e1-4e60-aeb7-f512c245885e/extract-utilities/0.log" Sep 29 10:49:38 crc kubenswrapper[4891]: I0929 10:49:38.868173 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5txtz_33c8f323-70e1-4e60-aeb7-f512c245885e/extract-content/0.log" Sep 29 10:49:38 crc kubenswrapper[4891]: I0929 10:49:38.898893 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5txtz_33c8f323-70e1-4e60-aeb7-f512c245885e/extract-utilities/0.log" Sep 29 10:49:38 crc kubenswrapper[4891]: I0929 10:49:38.933868 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5txtz_33c8f323-70e1-4e60-aeb7-f512c245885e/extract-content/0.log" Sep 29 10:49:38 crc kubenswrapper[4891]: I0929 10:49:38.964260 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xxqvf_c751fcd1-3522-4572-a3f1-52acfab7c45d/registry-server/0.log" Sep 29 10:49:39 crc kubenswrapper[4891]: I0929 10:49:39.142711 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-5txtz_33c8f323-70e1-4e60-aeb7-f512c245885e/extract-utilities/0.log" Sep 29 10:49:39 crc kubenswrapper[4891]: I0929 10:49:39.159428 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5txtz_33c8f323-70e1-4e60-aeb7-f512c245885e/extract-content/0.log" Sep 29 10:49:39 crc kubenswrapper[4891]: I0929 10:49:39.374184 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6_acde17bc-adb2-4193-a40f-d9a062f4f67a/util/0.log" Sep 29 10:49:39 crc kubenswrapper[4891]: I0929 10:49:39.530312 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6_acde17bc-adb2-4193-a40f-d9a062f4f67a/pull/0.log" Sep 29 10:49:39 crc kubenswrapper[4891]: I0929 10:49:39.534888 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6_acde17bc-adb2-4193-a40f-d9a062f4f67a/util/0.log" Sep 29 10:49:39 crc kubenswrapper[4891]: I0929 10:49:39.634912 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6_acde17bc-adb2-4193-a40f-d9a062f4f67a/pull/0.log" Sep 29 10:49:39 crc kubenswrapper[4891]: I0929 10:49:39.774604 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6_acde17bc-adb2-4193-a40f-d9a062f4f67a/util/0.log" Sep 29 10:49:39 crc kubenswrapper[4891]: I0929 10:49:39.776993 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5txtz_33c8f323-70e1-4e60-aeb7-f512c245885e/registry-server/0.log" Sep 29 10:49:39 crc kubenswrapper[4891]: I0929 10:49:39.806609 4891 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6_acde17bc-adb2-4193-a40f-d9a062f4f67a/pull/0.log" Sep 29 10:49:39 crc kubenswrapper[4891]: I0929 10:49:39.838563 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6_acde17bc-adb2-4193-a40f-d9a062f4f67a/extract/0.log" Sep 29 10:49:40 crc kubenswrapper[4891]: I0929 10:49:40.008613 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-gfmkm_bcfef239-c4e7-43c6-92f3-2092cd28922b/marketplace-operator/0.log" Sep 29 10:49:40 crc kubenswrapper[4891]: I0929 10:49:40.019263 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j656v_5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe/extract-utilities/0.log" Sep 29 10:49:40 crc kubenswrapper[4891]: I0929 10:49:40.263208 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j656v_5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe/extract-content/0.log" Sep 29 10:49:40 crc kubenswrapper[4891]: I0929 10:49:40.276882 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j656v_5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe/extract-content/0.log" Sep 29 10:49:40 crc kubenswrapper[4891]: I0929 10:49:40.300244 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j656v_5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe/extract-utilities/0.log" Sep 29 10:49:40 crc kubenswrapper[4891]: I0929 10:49:40.479103 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j656v_5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe/extract-content/0.log" Sep 29 10:49:40 crc kubenswrapper[4891]: I0929 10:49:40.571510 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-j656v_5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe/extract-utilities/0.log" Sep 29 10:49:40 crc kubenswrapper[4891]: I0929 10:49:40.657697 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j656v_5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe/registry-server/0.log" Sep 29 10:49:40 crc kubenswrapper[4891]: I0929 10:49:40.676179 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sn22t_5c2eff33-cdda-491e-a057-a6b1e0a2bd10/extract-utilities/0.log" Sep 29 10:49:40 crc kubenswrapper[4891]: I0929 10:49:40.836095 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sn22t_5c2eff33-cdda-491e-a057-a6b1e0a2bd10/extract-content/0.log" Sep 29 10:49:40 crc kubenswrapper[4891]: I0929 10:49:40.857927 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sn22t_5c2eff33-cdda-491e-a057-a6b1e0a2bd10/extract-content/0.log" Sep 29 10:49:40 crc kubenswrapper[4891]: I0929 10:49:40.871555 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sn22t_5c2eff33-cdda-491e-a057-a6b1e0a2bd10/extract-utilities/0.log" Sep 29 10:49:41 crc kubenswrapper[4891]: I0929 10:49:41.016643 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sn22t_5c2eff33-cdda-491e-a057-a6b1e0a2bd10/extract-content/0.log" Sep 29 10:49:41 crc kubenswrapper[4891]: I0929 10:49:41.020859 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sn22t_5c2eff33-cdda-491e-a057-a6b1e0a2bd10/extract-utilities/0.log" Sep 29 10:49:41 crc kubenswrapper[4891]: I0929 10:49:41.581414 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sn22t_5c2eff33-cdda-491e-a057-a6b1e0a2bd10/registry-server/0.log" Sep 29 
10:49:49 crc kubenswrapper[4891]: I0929 10:49:49.395832 4891 scope.go:117] "RemoveContainer" containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 10:49:49 crc kubenswrapper[4891]: E0929 10:49:49.396636 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:50:01 crc kubenswrapper[4891]: I0929 10:50:01.395849 4891 scope.go:117] "RemoveContainer" containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 10:50:01 crc kubenswrapper[4891]: E0929 10:50:01.396621 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:50:15 crc kubenswrapper[4891]: I0929 10:50:15.395766 4891 scope.go:117] "RemoveContainer" containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 10:50:15 crc kubenswrapper[4891]: E0929 10:50:15.396501 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" 
podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:50:30 crc kubenswrapper[4891]: I0929 10:50:30.401441 4891 scope.go:117] "RemoveContainer" containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 10:50:30 crc kubenswrapper[4891]: E0929 10:50:30.402305 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:50:45 crc kubenswrapper[4891]: I0929 10:50:45.396736 4891 scope.go:117] "RemoveContainer" containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 10:50:45 crc kubenswrapper[4891]: E0929 10:50:45.397473 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:50:57 crc kubenswrapper[4891]: I0929 10:50:57.395749 4891 scope.go:117] "RemoveContainer" containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 10:50:57 crc kubenswrapper[4891]: E0929 10:50:57.397690 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:51:10 crc kubenswrapper[4891]: I0929 10:51:10.407123 4891 scope.go:117] "RemoveContainer" containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 10:51:10 crc kubenswrapper[4891]: E0929 10:51:10.407966 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:51:21 crc kubenswrapper[4891]: I0929 10:51:21.395609 4891 scope.go:117] "RemoveContainer" containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 10:51:21 crc kubenswrapper[4891]: E0929 10:51:21.396859 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:51:34 crc kubenswrapper[4891]: I0929 10:51:34.398998 4891 scope.go:117] "RemoveContainer" containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 10:51:34 crc kubenswrapper[4891]: E0929 10:51:34.399929 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:51:36 crc kubenswrapper[4891]: I0929 10:51:36.213003 4891 generic.go:334] "Generic (PLEG): container finished" podID="686db550-749c-4c8f-9a0c-a962fe2b07c9" containerID="b928b7d55182a2f06d5db7370f15af1e008fc8789c1b9a27056fce6bb981da34" exitCode=0 Sep 29 10:51:36 crc kubenswrapper[4891]: I0929 10:51:36.213103 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ssd6w/must-gather-67qkg" event={"ID":"686db550-749c-4c8f-9a0c-a962fe2b07c9","Type":"ContainerDied","Data":"b928b7d55182a2f06d5db7370f15af1e008fc8789c1b9a27056fce6bb981da34"} Sep 29 10:51:36 crc kubenswrapper[4891]: I0929 10:51:36.214347 4891 scope.go:117] "RemoveContainer" containerID="b928b7d55182a2f06d5db7370f15af1e008fc8789c1b9a27056fce6bb981da34" Sep 29 10:51:36 crc kubenswrapper[4891]: I0929 10:51:36.485917 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ssd6w_must-gather-67qkg_686db550-749c-4c8f-9a0c-a962fe2b07c9/gather/0.log" Sep 29 10:51:44 crc kubenswrapper[4891]: I0929 10:51:44.311536 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ssd6w/must-gather-67qkg"] Sep 29 10:51:44 crc kubenswrapper[4891]: I0929 10:51:44.313659 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-ssd6w/must-gather-67qkg" podUID="686db550-749c-4c8f-9a0c-a962fe2b07c9" containerName="copy" containerID="cri-o://eb9134cec5ab06d54dd532f360e79a441273a4fb8e9e1af98a1b62e72aa1a1ed" gracePeriod=2 Sep 29 10:51:44 crc kubenswrapper[4891]: I0929 10:51:44.320977 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ssd6w/must-gather-67qkg"] Sep 29 10:51:44 crc kubenswrapper[4891]: I0929 10:51:44.799570 4891 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-must-gather-ssd6w_must-gather-67qkg_686db550-749c-4c8f-9a0c-a962fe2b07c9/copy/0.log" Sep 29 10:51:44 crc kubenswrapper[4891]: I0929 10:51:44.800412 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ssd6w/must-gather-67qkg" Sep 29 10:51:44 crc kubenswrapper[4891]: I0929 10:51:44.980931 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2fm8\" (UniqueName: \"kubernetes.io/projected/686db550-749c-4c8f-9a0c-a962fe2b07c9-kube-api-access-q2fm8\") pod \"686db550-749c-4c8f-9a0c-a962fe2b07c9\" (UID: \"686db550-749c-4c8f-9a0c-a962fe2b07c9\") " Sep 29 10:51:44 crc kubenswrapper[4891]: I0929 10:51:44.980990 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/686db550-749c-4c8f-9a0c-a962fe2b07c9-must-gather-output\") pod \"686db550-749c-4c8f-9a0c-a962fe2b07c9\" (UID: \"686db550-749c-4c8f-9a0c-a962fe2b07c9\") " Sep 29 10:51:44 crc kubenswrapper[4891]: I0929 10:51:44.986366 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/686db550-749c-4c8f-9a0c-a962fe2b07c9-kube-api-access-q2fm8" (OuterVolumeSpecName: "kube-api-access-q2fm8") pod "686db550-749c-4c8f-9a0c-a962fe2b07c9" (UID: "686db550-749c-4c8f-9a0c-a962fe2b07c9"). InnerVolumeSpecName "kube-api-access-q2fm8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:51:45 crc kubenswrapper[4891]: I0929 10:51:45.084369 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2fm8\" (UniqueName: \"kubernetes.io/projected/686db550-749c-4c8f-9a0c-a962fe2b07c9-kube-api-access-q2fm8\") on node \"crc\" DevicePath \"\"" Sep 29 10:51:45 crc kubenswrapper[4891]: I0929 10:51:45.150579 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/686db550-749c-4c8f-9a0c-a962fe2b07c9-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "686db550-749c-4c8f-9a0c-a962fe2b07c9" (UID: "686db550-749c-4c8f-9a0c-a962fe2b07c9"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:51:45 crc kubenswrapper[4891]: I0929 10:51:45.186054 4891 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/686db550-749c-4c8f-9a0c-a962fe2b07c9-must-gather-output\") on node \"crc\" DevicePath \"\"" Sep 29 10:51:45 crc kubenswrapper[4891]: I0929 10:51:45.300209 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ssd6w_must-gather-67qkg_686db550-749c-4c8f-9a0c-a962fe2b07c9/copy/0.log" Sep 29 10:51:45 crc kubenswrapper[4891]: I0929 10:51:45.300582 4891 generic.go:334] "Generic (PLEG): container finished" podID="686db550-749c-4c8f-9a0c-a962fe2b07c9" containerID="eb9134cec5ab06d54dd532f360e79a441273a4fb8e9e1af98a1b62e72aa1a1ed" exitCode=143 Sep 29 10:51:45 crc kubenswrapper[4891]: I0929 10:51:45.300630 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ssd6w/must-gather-67qkg" Sep 29 10:51:45 crc kubenswrapper[4891]: I0929 10:51:45.300641 4891 scope.go:117] "RemoveContainer" containerID="eb9134cec5ab06d54dd532f360e79a441273a4fb8e9e1af98a1b62e72aa1a1ed" Sep 29 10:51:45 crc kubenswrapper[4891]: I0929 10:51:45.322157 4891 scope.go:117] "RemoveContainer" containerID="b928b7d55182a2f06d5db7370f15af1e008fc8789c1b9a27056fce6bb981da34" Sep 29 10:51:45 crc kubenswrapper[4891]: I0929 10:51:45.374080 4891 scope.go:117] "RemoveContainer" containerID="eb9134cec5ab06d54dd532f360e79a441273a4fb8e9e1af98a1b62e72aa1a1ed" Sep 29 10:51:45 crc kubenswrapper[4891]: E0929 10:51:45.374523 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb9134cec5ab06d54dd532f360e79a441273a4fb8e9e1af98a1b62e72aa1a1ed\": container with ID starting with eb9134cec5ab06d54dd532f360e79a441273a4fb8e9e1af98a1b62e72aa1a1ed not found: ID does not exist" containerID="eb9134cec5ab06d54dd532f360e79a441273a4fb8e9e1af98a1b62e72aa1a1ed" Sep 29 10:51:45 crc kubenswrapper[4891]: I0929 10:51:45.374572 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb9134cec5ab06d54dd532f360e79a441273a4fb8e9e1af98a1b62e72aa1a1ed"} err="failed to get container status \"eb9134cec5ab06d54dd532f360e79a441273a4fb8e9e1af98a1b62e72aa1a1ed\": rpc error: code = NotFound desc = could not find container \"eb9134cec5ab06d54dd532f360e79a441273a4fb8e9e1af98a1b62e72aa1a1ed\": container with ID starting with eb9134cec5ab06d54dd532f360e79a441273a4fb8e9e1af98a1b62e72aa1a1ed not found: ID does not exist" Sep 29 10:51:45 crc kubenswrapper[4891]: I0929 10:51:45.374593 4891 scope.go:117] "RemoveContainer" containerID="b928b7d55182a2f06d5db7370f15af1e008fc8789c1b9a27056fce6bb981da34" Sep 29 10:51:45 crc kubenswrapper[4891]: E0929 10:51:45.374953 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"b928b7d55182a2f06d5db7370f15af1e008fc8789c1b9a27056fce6bb981da34\": container with ID starting with b928b7d55182a2f06d5db7370f15af1e008fc8789c1b9a27056fce6bb981da34 not found: ID does not exist" containerID="b928b7d55182a2f06d5db7370f15af1e008fc8789c1b9a27056fce6bb981da34" Sep 29 10:51:45 crc kubenswrapper[4891]: I0929 10:51:45.375005 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b928b7d55182a2f06d5db7370f15af1e008fc8789c1b9a27056fce6bb981da34"} err="failed to get container status \"b928b7d55182a2f06d5db7370f15af1e008fc8789c1b9a27056fce6bb981da34\": rpc error: code = NotFound desc = could not find container \"b928b7d55182a2f06d5db7370f15af1e008fc8789c1b9a27056fce6bb981da34\": container with ID starting with b928b7d55182a2f06d5db7370f15af1e008fc8789c1b9a27056fce6bb981da34 not found: ID does not exist" Sep 29 10:51:46 crc kubenswrapper[4891]: I0929 10:51:46.395852 4891 scope.go:117] "RemoveContainer" containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 10:51:46 crc kubenswrapper[4891]: E0929 10:51:46.396337 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:51:46 crc kubenswrapper[4891]: I0929 10:51:46.406105 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="686db550-749c-4c8f-9a0c-a962fe2b07c9" path="/var/lib/kubelet/pods/686db550-749c-4c8f-9a0c-a962fe2b07c9/volumes" Sep 29 10:51:59 crc kubenswrapper[4891]: I0929 10:51:59.396449 4891 scope.go:117] "RemoveContainer" 
containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 10:51:59 crc kubenswrapper[4891]: E0929 10:51:59.397452 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:52:00 crc kubenswrapper[4891]: I0929 10:52:00.674286 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2xwhk"] Sep 29 10:52:00 crc kubenswrapper[4891]: E0929 10:52:00.674699 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="686db550-749c-4c8f-9a0c-a962fe2b07c9" containerName="gather" Sep 29 10:52:00 crc kubenswrapper[4891]: I0929 10:52:00.674711 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="686db550-749c-4c8f-9a0c-a962fe2b07c9" containerName="gather" Sep 29 10:52:00 crc kubenswrapper[4891]: E0929 10:52:00.674739 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="686db550-749c-4c8f-9a0c-a962fe2b07c9" containerName="copy" Sep 29 10:52:00 crc kubenswrapper[4891]: I0929 10:52:00.674746 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="686db550-749c-4c8f-9a0c-a962fe2b07c9" containerName="copy" Sep 29 10:52:00 crc kubenswrapper[4891]: E0929 10:52:00.674766 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30320c5e-3ad3-476f-ae27-50d306a5a61d" containerName="container-00" Sep 29 10:52:00 crc kubenswrapper[4891]: I0929 10:52:00.674775 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="30320c5e-3ad3-476f-ae27-50d306a5a61d" containerName="container-00" Sep 29 10:52:00 crc kubenswrapper[4891]: I0929 10:52:00.675040 4891 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="686db550-749c-4c8f-9a0c-a962fe2b07c9" containerName="gather" Sep 29 10:52:00 crc kubenswrapper[4891]: I0929 10:52:00.675080 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="30320c5e-3ad3-476f-ae27-50d306a5a61d" containerName="container-00" Sep 29 10:52:00 crc kubenswrapper[4891]: I0929 10:52:00.675106 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="686db550-749c-4c8f-9a0c-a962fe2b07c9" containerName="copy" Sep 29 10:52:00 crc kubenswrapper[4891]: I0929 10:52:00.676363 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2xwhk" Sep 29 10:52:00 crc kubenswrapper[4891]: I0929 10:52:00.684836 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2xwhk"] Sep 29 10:52:00 crc kubenswrapper[4891]: I0929 10:52:00.735765 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwvbb\" (UniqueName: \"kubernetes.io/projected/2fdb90e5-73aa-489e-8a31-10c0d0c6875c-kube-api-access-bwvbb\") pod \"certified-operators-2xwhk\" (UID: \"2fdb90e5-73aa-489e-8a31-10c0d0c6875c\") " pod="openshift-marketplace/certified-operators-2xwhk" Sep 29 10:52:00 crc kubenswrapper[4891]: I0929 10:52:00.736343 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fdb90e5-73aa-489e-8a31-10c0d0c6875c-utilities\") pod \"certified-operators-2xwhk\" (UID: \"2fdb90e5-73aa-489e-8a31-10c0d0c6875c\") " pod="openshift-marketplace/certified-operators-2xwhk" Sep 29 10:52:00 crc kubenswrapper[4891]: I0929 10:52:00.736484 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fdb90e5-73aa-489e-8a31-10c0d0c6875c-catalog-content\") pod \"certified-operators-2xwhk\" (UID: 
\"2fdb90e5-73aa-489e-8a31-10c0d0c6875c\") " pod="openshift-marketplace/certified-operators-2xwhk" Sep 29 10:52:00 crc kubenswrapper[4891]: I0929 10:52:00.839378 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwvbb\" (UniqueName: \"kubernetes.io/projected/2fdb90e5-73aa-489e-8a31-10c0d0c6875c-kube-api-access-bwvbb\") pod \"certified-operators-2xwhk\" (UID: \"2fdb90e5-73aa-489e-8a31-10c0d0c6875c\") " pod="openshift-marketplace/certified-operators-2xwhk" Sep 29 10:52:00 crc kubenswrapper[4891]: I0929 10:52:00.839442 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fdb90e5-73aa-489e-8a31-10c0d0c6875c-utilities\") pod \"certified-operators-2xwhk\" (UID: \"2fdb90e5-73aa-489e-8a31-10c0d0c6875c\") " pod="openshift-marketplace/certified-operators-2xwhk" Sep 29 10:52:00 crc kubenswrapper[4891]: I0929 10:52:00.839484 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fdb90e5-73aa-489e-8a31-10c0d0c6875c-catalog-content\") pod \"certified-operators-2xwhk\" (UID: \"2fdb90e5-73aa-489e-8a31-10c0d0c6875c\") " pod="openshift-marketplace/certified-operators-2xwhk" Sep 29 10:52:00 crc kubenswrapper[4891]: I0929 10:52:00.840109 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fdb90e5-73aa-489e-8a31-10c0d0c6875c-catalog-content\") pod \"certified-operators-2xwhk\" (UID: \"2fdb90e5-73aa-489e-8a31-10c0d0c6875c\") " pod="openshift-marketplace/certified-operators-2xwhk" Sep 29 10:52:00 crc kubenswrapper[4891]: I0929 10:52:00.840706 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fdb90e5-73aa-489e-8a31-10c0d0c6875c-utilities\") pod \"certified-operators-2xwhk\" (UID: \"2fdb90e5-73aa-489e-8a31-10c0d0c6875c\") 
" pod="openshift-marketplace/certified-operators-2xwhk" Sep 29 10:52:00 crc kubenswrapper[4891]: I0929 10:52:00.859238 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwvbb\" (UniqueName: \"kubernetes.io/projected/2fdb90e5-73aa-489e-8a31-10c0d0c6875c-kube-api-access-bwvbb\") pod \"certified-operators-2xwhk\" (UID: \"2fdb90e5-73aa-489e-8a31-10c0d0c6875c\") " pod="openshift-marketplace/certified-operators-2xwhk" Sep 29 10:52:01 crc kubenswrapper[4891]: I0929 10:52:01.007921 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2xwhk" Sep 29 10:52:01 crc kubenswrapper[4891]: I0929 10:52:01.549812 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2xwhk"] Sep 29 10:52:02 crc kubenswrapper[4891]: I0929 10:52:02.533759 4891 generic.go:334] "Generic (PLEG): container finished" podID="2fdb90e5-73aa-489e-8a31-10c0d0c6875c" containerID="9f04e6693f06c91a7a2fa02b705d9e3ec37812444fdd022afea0916533e138f7" exitCode=0 Sep 29 10:52:02 crc kubenswrapper[4891]: I0929 10:52:02.534299 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2xwhk" event={"ID":"2fdb90e5-73aa-489e-8a31-10c0d0c6875c","Type":"ContainerDied","Data":"9f04e6693f06c91a7a2fa02b705d9e3ec37812444fdd022afea0916533e138f7"} Sep 29 10:52:02 crc kubenswrapper[4891]: I0929 10:52:02.534330 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2xwhk" event={"ID":"2fdb90e5-73aa-489e-8a31-10c0d0c6875c","Type":"ContainerStarted","Data":"f3dad205b9453f0f7ffc3171759bc24a04cee13240606fd831df975294e9b9ea"} Sep 29 10:52:02 crc kubenswrapper[4891]: I0929 10:52:02.537026 4891 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 10:52:03 crc kubenswrapper[4891]: I0929 10:52:03.549428 4891 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-2xwhk" event={"ID":"2fdb90e5-73aa-489e-8a31-10c0d0c6875c","Type":"ContainerStarted","Data":"1f7c381869a121d57bd47a34c36fd79604b08def8d8958f2f097ab7d3b9fbfaf"} Sep 29 10:52:04 crc kubenswrapper[4891]: I0929 10:52:04.561122 4891 generic.go:334] "Generic (PLEG): container finished" podID="2fdb90e5-73aa-489e-8a31-10c0d0c6875c" containerID="1f7c381869a121d57bd47a34c36fd79604b08def8d8958f2f097ab7d3b9fbfaf" exitCode=0 Sep 29 10:52:04 crc kubenswrapper[4891]: I0929 10:52:04.561203 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2xwhk" event={"ID":"2fdb90e5-73aa-489e-8a31-10c0d0c6875c","Type":"ContainerDied","Data":"1f7c381869a121d57bd47a34c36fd79604b08def8d8958f2f097ab7d3b9fbfaf"} Sep 29 10:52:05 crc kubenswrapper[4891]: I0929 10:52:05.578434 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2xwhk" event={"ID":"2fdb90e5-73aa-489e-8a31-10c0d0c6875c","Type":"ContainerStarted","Data":"8d846565f9e3984a79a27bf286254c7a8124d6d91edb7a3f70f09d0f45035a2b"} Sep 29 10:52:05 crc kubenswrapper[4891]: I0929 10:52:05.602591 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2xwhk" podStartSLOduration=3.179823923 podStartE2EDuration="5.602572295s" podCreationTimestamp="2025-09-29 10:52:00 +0000 UTC" firstStartedPulling="2025-09-29 10:52:02.536683429 +0000 UTC m=+3852.741851750" lastFinishedPulling="2025-09-29 10:52:04.959431811 +0000 UTC m=+3855.164600122" observedRunningTime="2025-09-29 10:52:05.601313739 +0000 UTC m=+3855.806482070" watchObservedRunningTime="2025-09-29 10:52:05.602572295 +0000 UTC m=+3855.807740626" Sep 29 10:52:11 crc kubenswrapper[4891]: I0929 10:52:11.008229 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2xwhk" Sep 29 10:52:11 crc kubenswrapper[4891]: I0929 10:52:11.008904 
4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2xwhk" Sep 29 10:52:11 crc kubenswrapper[4891]: I0929 10:52:11.060953 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2xwhk" Sep 29 10:52:11 crc kubenswrapper[4891]: I0929 10:52:11.396196 4891 scope.go:117] "RemoveContainer" containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 10:52:11 crc kubenswrapper[4891]: E0929 10:52:11.396669 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:52:11 crc kubenswrapper[4891]: I0929 10:52:11.696123 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2xwhk" Sep 29 10:52:11 crc kubenswrapper[4891]: I0929 10:52:11.747959 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2xwhk"] Sep 29 10:52:13 crc kubenswrapper[4891]: I0929 10:52:13.648207 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2xwhk" podUID="2fdb90e5-73aa-489e-8a31-10c0d0c6875c" containerName="registry-server" containerID="cri-o://8d846565f9e3984a79a27bf286254c7a8124d6d91edb7a3f70f09d0f45035a2b" gracePeriod=2 Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.119990 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2xwhk" Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.217920 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fdb90e5-73aa-489e-8a31-10c0d0c6875c-utilities\") pod \"2fdb90e5-73aa-489e-8a31-10c0d0c6875c\" (UID: \"2fdb90e5-73aa-489e-8a31-10c0d0c6875c\") " Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.218165 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fdb90e5-73aa-489e-8a31-10c0d0c6875c-catalog-content\") pod \"2fdb90e5-73aa-489e-8a31-10c0d0c6875c\" (UID: \"2fdb90e5-73aa-489e-8a31-10c0d0c6875c\") " Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.218269 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwvbb\" (UniqueName: \"kubernetes.io/projected/2fdb90e5-73aa-489e-8a31-10c0d0c6875c-kube-api-access-bwvbb\") pod \"2fdb90e5-73aa-489e-8a31-10c0d0c6875c\" (UID: \"2fdb90e5-73aa-489e-8a31-10c0d0c6875c\") " Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.222570 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fdb90e5-73aa-489e-8a31-10c0d0c6875c-utilities" (OuterVolumeSpecName: "utilities") pod "2fdb90e5-73aa-489e-8a31-10c0d0c6875c" (UID: "2fdb90e5-73aa-489e-8a31-10c0d0c6875c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.225684 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fdb90e5-73aa-489e-8a31-10c0d0c6875c-kube-api-access-bwvbb" (OuterVolumeSpecName: "kube-api-access-bwvbb") pod "2fdb90e5-73aa-489e-8a31-10c0d0c6875c" (UID: "2fdb90e5-73aa-489e-8a31-10c0d0c6875c"). InnerVolumeSpecName "kube-api-access-bwvbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.263870 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fdb90e5-73aa-489e-8a31-10c0d0c6875c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2fdb90e5-73aa-489e-8a31-10c0d0c6875c" (UID: "2fdb90e5-73aa-489e-8a31-10c0d0c6875c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.320951 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fdb90e5-73aa-489e-8a31-10c0d0c6875c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.320980 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwvbb\" (UniqueName: \"kubernetes.io/projected/2fdb90e5-73aa-489e-8a31-10c0d0c6875c-kube-api-access-bwvbb\") on node \"crc\" DevicePath \"\"" Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.320997 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fdb90e5-73aa-489e-8a31-10c0d0c6875c-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.663718 4891 generic.go:334] "Generic (PLEG): container finished" podID="2fdb90e5-73aa-489e-8a31-10c0d0c6875c" containerID="8d846565f9e3984a79a27bf286254c7a8124d6d91edb7a3f70f09d0f45035a2b" exitCode=0 Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.663770 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2xwhk" Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.663775 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2xwhk" event={"ID":"2fdb90e5-73aa-489e-8a31-10c0d0c6875c","Type":"ContainerDied","Data":"8d846565f9e3984a79a27bf286254c7a8124d6d91edb7a3f70f09d0f45035a2b"} Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.663991 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2xwhk" event={"ID":"2fdb90e5-73aa-489e-8a31-10c0d0c6875c","Type":"ContainerDied","Data":"f3dad205b9453f0f7ffc3171759bc24a04cee13240606fd831df975294e9b9ea"} Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.664018 4891 scope.go:117] "RemoveContainer" containerID="8d846565f9e3984a79a27bf286254c7a8124d6d91edb7a3f70f09d0f45035a2b" Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.696561 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2xwhk"] Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.710703 4891 scope.go:117] "RemoveContainer" containerID="1f7c381869a121d57bd47a34c36fd79604b08def8d8958f2f097ab7d3b9fbfaf" Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.713631 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2xwhk"] Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.752070 4891 scope.go:117] "RemoveContainer" containerID="9f04e6693f06c91a7a2fa02b705d9e3ec37812444fdd022afea0916533e138f7" Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.789524 4891 scope.go:117] "RemoveContainer" containerID="8d846565f9e3984a79a27bf286254c7a8124d6d91edb7a3f70f09d0f45035a2b" Sep 29 10:52:14 crc kubenswrapper[4891]: E0929 10:52:14.790033 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8d846565f9e3984a79a27bf286254c7a8124d6d91edb7a3f70f09d0f45035a2b\": container with ID starting with 8d846565f9e3984a79a27bf286254c7a8124d6d91edb7a3f70f09d0f45035a2b not found: ID does not exist" containerID="8d846565f9e3984a79a27bf286254c7a8124d6d91edb7a3f70f09d0f45035a2b" Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.790069 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d846565f9e3984a79a27bf286254c7a8124d6d91edb7a3f70f09d0f45035a2b"} err="failed to get container status \"8d846565f9e3984a79a27bf286254c7a8124d6d91edb7a3f70f09d0f45035a2b\": rpc error: code = NotFound desc = could not find container \"8d846565f9e3984a79a27bf286254c7a8124d6d91edb7a3f70f09d0f45035a2b\": container with ID starting with 8d846565f9e3984a79a27bf286254c7a8124d6d91edb7a3f70f09d0f45035a2b not found: ID does not exist" Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.790099 4891 scope.go:117] "RemoveContainer" containerID="1f7c381869a121d57bd47a34c36fd79604b08def8d8958f2f097ab7d3b9fbfaf" Sep 29 10:52:14 crc kubenswrapper[4891]: E0929 10:52:14.790503 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f7c381869a121d57bd47a34c36fd79604b08def8d8958f2f097ab7d3b9fbfaf\": container with ID starting with 1f7c381869a121d57bd47a34c36fd79604b08def8d8958f2f097ab7d3b9fbfaf not found: ID does not exist" containerID="1f7c381869a121d57bd47a34c36fd79604b08def8d8958f2f097ab7d3b9fbfaf" Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.790553 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f7c381869a121d57bd47a34c36fd79604b08def8d8958f2f097ab7d3b9fbfaf"} err="failed to get container status \"1f7c381869a121d57bd47a34c36fd79604b08def8d8958f2f097ab7d3b9fbfaf\": rpc error: code = NotFound desc = could not find container \"1f7c381869a121d57bd47a34c36fd79604b08def8d8958f2f097ab7d3b9fbfaf\": container with ID 
starting with 1f7c381869a121d57bd47a34c36fd79604b08def8d8958f2f097ab7d3b9fbfaf not found: ID does not exist" Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.790584 4891 scope.go:117] "RemoveContainer" containerID="9f04e6693f06c91a7a2fa02b705d9e3ec37812444fdd022afea0916533e138f7" Sep 29 10:52:14 crc kubenswrapper[4891]: E0929 10:52:14.791070 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f04e6693f06c91a7a2fa02b705d9e3ec37812444fdd022afea0916533e138f7\": container with ID starting with 9f04e6693f06c91a7a2fa02b705d9e3ec37812444fdd022afea0916533e138f7 not found: ID does not exist" containerID="9f04e6693f06c91a7a2fa02b705d9e3ec37812444fdd022afea0916533e138f7" Sep 29 10:52:14 crc kubenswrapper[4891]: I0929 10:52:14.791127 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f04e6693f06c91a7a2fa02b705d9e3ec37812444fdd022afea0916533e138f7"} err="failed to get container status \"9f04e6693f06c91a7a2fa02b705d9e3ec37812444fdd022afea0916533e138f7\": rpc error: code = NotFound desc = could not find container \"9f04e6693f06c91a7a2fa02b705d9e3ec37812444fdd022afea0916533e138f7\": container with ID starting with 9f04e6693f06c91a7a2fa02b705d9e3ec37812444fdd022afea0916533e138f7 not found: ID does not exist" Sep 29 10:52:16 crc kubenswrapper[4891]: I0929 10:52:16.417601 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fdb90e5-73aa-489e-8a31-10c0d0c6875c" path="/var/lib/kubelet/pods/2fdb90e5-73aa-489e-8a31-10c0d0c6875c/volumes" Sep 29 10:52:18 crc kubenswrapper[4891]: I0929 10:52:18.600725 4891 scope.go:117] "RemoveContainer" containerID="7e549e280d66f2dcb85cc161562e5667b689c98e4c13af30cbe84300b384299a" Sep 29 10:52:24 crc kubenswrapper[4891]: I0929 10:52:24.396447 4891 scope.go:117] "RemoveContainer" containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 10:52:24 crc kubenswrapper[4891]: 
E0929 10:52:24.397421 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:52:34 crc kubenswrapper[4891]: I0929 10:52:34.454710 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9ww5r/must-gather-rt8g7"] Sep 29 10:52:34 crc kubenswrapper[4891]: E0929 10:52:34.455689 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fdb90e5-73aa-489e-8a31-10c0d0c6875c" containerName="extract-utilities" Sep 29 10:52:34 crc kubenswrapper[4891]: I0929 10:52:34.455703 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fdb90e5-73aa-489e-8a31-10c0d0c6875c" containerName="extract-utilities" Sep 29 10:52:34 crc kubenswrapper[4891]: E0929 10:52:34.455724 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fdb90e5-73aa-489e-8a31-10c0d0c6875c" containerName="extract-content" Sep 29 10:52:34 crc kubenswrapper[4891]: I0929 10:52:34.455730 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fdb90e5-73aa-489e-8a31-10c0d0c6875c" containerName="extract-content" Sep 29 10:52:34 crc kubenswrapper[4891]: E0929 10:52:34.455739 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fdb90e5-73aa-489e-8a31-10c0d0c6875c" containerName="registry-server" Sep 29 10:52:34 crc kubenswrapper[4891]: I0929 10:52:34.455745 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fdb90e5-73aa-489e-8a31-10c0d0c6875c" containerName="registry-server" Sep 29 10:52:34 crc kubenswrapper[4891]: I0929 10:52:34.456009 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fdb90e5-73aa-489e-8a31-10c0d0c6875c" 
containerName="registry-server" Sep 29 10:52:34 crc kubenswrapper[4891]: I0929 10:52:34.457357 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9ww5r/must-gather-rt8g7" Sep 29 10:52:34 crc kubenswrapper[4891]: I0929 10:52:34.459275 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9ww5r"/"kube-root-ca.crt" Sep 29 10:52:34 crc kubenswrapper[4891]: I0929 10:52:34.459363 4891 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9ww5r"/"default-dockercfg-8qtbs" Sep 29 10:52:34 crc kubenswrapper[4891]: I0929 10:52:34.459281 4891 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9ww5r"/"openshift-service-ca.crt" Sep 29 10:52:34 crc kubenswrapper[4891]: I0929 10:52:34.464786 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9ww5r/must-gather-rt8g7"] Sep 29 10:52:34 crc kubenswrapper[4891]: I0929 10:52:34.551701 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f5c7b77f-4cad-42de-be3d-cba2ad258c6c-must-gather-output\") pod \"must-gather-rt8g7\" (UID: \"f5c7b77f-4cad-42de-be3d-cba2ad258c6c\") " pod="openshift-must-gather-9ww5r/must-gather-rt8g7" Sep 29 10:52:34 crc kubenswrapper[4891]: I0929 10:52:34.551755 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f2kl\" (UniqueName: \"kubernetes.io/projected/f5c7b77f-4cad-42de-be3d-cba2ad258c6c-kube-api-access-7f2kl\") pod \"must-gather-rt8g7\" (UID: \"f5c7b77f-4cad-42de-be3d-cba2ad258c6c\") " pod="openshift-must-gather-9ww5r/must-gather-rt8g7" Sep 29 10:52:34 crc kubenswrapper[4891]: I0929 10:52:34.654015 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/f5c7b77f-4cad-42de-be3d-cba2ad258c6c-must-gather-output\") pod \"must-gather-rt8g7\" (UID: \"f5c7b77f-4cad-42de-be3d-cba2ad258c6c\") " pod="openshift-must-gather-9ww5r/must-gather-rt8g7" Sep 29 10:52:34 crc kubenswrapper[4891]: I0929 10:52:34.654071 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f2kl\" (UniqueName: \"kubernetes.io/projected/f5c7b77f-4cad-42de-be3d-cba2ad258c6c-kube-api-access-7f2kl\") pod \"must-gather-rt8g7\" (UID: \"f5c7b77f-4cad-42de-be3d-cba2ad258c6c\") " pod="openshift-must-gather-9ww5r/must-gather-rt8g7" Sep 29 10:52:34 crc kubenswrapper[4891]: I0929 10:52:34.654894 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f5c7b77f-4cad-42de-be3d-cba2ad258c6c-must-gather-output\") pod \"must-gather-rt8g7\" (UID: \"f5c7b77f-4cad-42de-be3d-cba2ad258c6c\") " pod="openshift-must-gather-9ww5r/must-gather-rt8g7" Sep 29 10:52:34 crc kubenswrapper[4891]: I0929 10:52:34.671994 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f2kl\" (UniqueName: \"kubernetes.io/projected/f5c7b77f-4cad-42de-be3d-cba2ad258c6c-kube-api-access-7f2kl\") pod \"must-gather-rt8g7\" (UID: \"f5c7b77f-4cad-42de-be3d-cba2ad258c6c\") " pod="openshift-must-gather-9ww5r/must-gather-rt8g7" Sep 29 10:52:34 crc kubenswrapper[4891]: I0929 10:52:34.776285 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9ww5r/must-gather-rt8g7" Sep 29 10:52:35 crc kubenswrapper[4891]: I0929 10:52:35.255145 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9ww5r/must-gather-rt8g7"] Sep 29 10:52:35 crc kubenswrapper[4891]: I0929 10:52:35.920336 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9ww5r/must-gather-rt8g7" event={"ID":"f5c7b77f-4cad-42de-be3d-cba2ad258c6c","Type":"ContainerStarted","Data":"05b463472d2fba6c06a5111ca4bb0e542a5a6c9cad188d589818fa0cf363e92e"} Sep 29 10:52:35 crc kubenswrapper[4891]: I0929 10:52:35.920726 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9ww5r/must-gather-rt8g7" event={"ID":"f5c7b77f-4cad-42de-be3d-cba2ad258c6c","Type":"ContainerStarted","Data":"7491fd57c1e0c95191b72a343f7ec9434e0acf4fd1f2847c9b52c17bfb48f1c9"} Sep 29 10:52:35 crc kubenswrapper[4891]: I0929 10:52:35.920756 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9ww5r/must-gather-rt8g7" event={"ID":"f5c7b77f-4cad-42de-be3d-cba2ad258c6c","Type":"ContainerStarted","Data":"c89c3f86bcb26e07e14be9826cc709b854fbb420f3cd1268c28b647a29b78396"} Sep 29 10:52:35 crc kubenswrapper[4891]: I0929 10:52:35.938517 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9ww5r/must-gather-rt8g7" podStartSLOduration=1.938497932 podStartE2EDuration="1.938497932s" podCreationTimestamp="2025-09-29 10:52:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:52:35.937617776 +0000 UTC m=+3886.142786107" watchObservedRunningTime="2025-09-29 10:52:35.938497932 +0000 UTC m=+3886.143666293" Sep 29 10:52:38 crc kubenswrapper[4891]: I0929 10:52:38.965138 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9ww5r/crc-debug-wwxk6"] Sep 29 10:52:38 crc kubenswrapper[4891]: 
I0929 10:52:38.971093 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9ww5r/crc-debug-wwxk6" Sep 29 10:52:39 crc kubenswrapper[4891]: I0929 10:52:39.147870 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q6gm\" (UniqueName: \"kubernetes.io/projected/9f16c6f6-9b1c-4bea-a66e-8bdc678ca682-kube-api-access-7q6gm\") pod \"crc-debug-wwxk6\" (UID: \"9f16c6f6-9b1c-4bea-a66e-8bdc678ca682\") " pod="openshift-must-gather-9ww5r/crc-debug-wwxk6" Sep 29 10:52:39 crc kubenswrapper[4891]: I0929 10:52:39.148500 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9f16c6f6-9b1c-4bea-a66e-8bdc678ca682-host\") pod \"crc-debug-wwxk6\" (UID: \"9f16c6f6-9b1c-4bea-a66e-8bdc678ca682\") " pod="openshift-must-gather-9ww5r/crc-debug-wwxk6" Sep 29 10:52:39 crc kubenswrapper[4891]: I0929 10:52:39.250064 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q6gm\" (UniqueName: \"kubernetes.io/projected/9f16c6f6-9b1c-4bea-a66e-8bdc678ca682-kube-api-access-7q6gm\") pod \"crc-debug-wwxk6\" (UID: \"9f16c6f6-9b1c-4bea-a66e-8bdc678ca682\") " pod="openshift-must-gather-9ww5r/crc-debug-wwxk6" Sep 29 10:52:39 crc kubenswrapper[4891]: I0929 10:52:39.250211 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9f16c6f6-9b1c-4bea-a66e-8bdc678ca682-host\") pod \"crc-debug-wwxk6\" (UID: \"9f16c6f6-9b1c-4bea-a66e-8bdc678ca682\") " pod="openshift-must-gather-9ww5r/crc-debug-wwxk6" Sep 29 10:52:39 crc kubenswrapper[4891]: I0929 10:52:39.250390 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9f16c6f6-9b1c-4bea-a66e-8bdc678ca682-host\") pod \"crc-debug-wwxk6\" (UID: \"9f16c6f6-9b1c-4bea-a66e-8bdc678ca682\") 
" pod="openshift-must-gather-9ww5r/crc-debug-wwxk6" Sep 29 10:52:39 crc kubenswrapper[4891]: I0929 10:52:39.274383 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q6gm\" (UniqueName: \"kubernetes.io/projected/9f16c6f6-9b1c-4bea-a66e-8bdc678ca682-kube-api-access-7q6gm\") pod \"crc-debug-wwxk6\" (UID: \"9f16c6f6-9b1c-4bea-a66e-8bdc678ca682\") " pod="openshift-must-gather-9ww5r/crc-debug-wwxk6" Sep 29 10:52:39 crc kubenswrapper[4891]: I0929 10:52:39.292055 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9ww5r/crc-debug-wwxk6" Sep 29 10:52:39 crc kubenswrapper[4891]: W0929 10:52:39.319677 4891 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f16c6f6_9b1c_4bea_a66e_8bdc678ca682.slice/crio-f4418c1f027dd61abd792614f72b0005c10345c57de34d2927a03d0026805265 WatchSource:0}: Error finding container f4418c1f027dd61abd792614f72b0005c10345c57de34d2927a03d0026805265: Status 404 returned error can't find the container with id f4418c1f027dd61abd792614f72b0005c10345c57de34d2927a03d0026805265 Sep 29 10:52:39 crc kubenswrapper[4891]: I0929 10:52:39.395725 4891 scope.go:117] "RemoveContainer" containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 10:52:39 crc kubenswrapper[4891]: E0929 10:52:39.396196 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:52:39 crc kubenswrapper[4891]: I0929 10:52:39.956162 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-9ww5r/crc-debug-wwxk6" event={"ID":"9f16c6f6-9b1c-4bea-a66e-8bdc678ca682","Type":"ContainerStarted","Data":"71c31e21cc80ce015ae216f6f52b7093b045f1e91daee101c63e77b0119fca0e"} Sep 29 10:52:39 crc kubenswrapper[4891]: I0929 10:52:39.956870 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9ww5r/crc-debug-wwxk6" event={"ID":"9f16c6f6-9b1c-4bea-a66e-8bdc678ca682","Type":"ContainerStarted","Data":"f4418c1f027dd61abd792614f72b0005c10345c57de34d2927a03d0026805265"} Sep 29 10:52:39 crc kubenswrapper[4891]: I0929 10:52:39.978813 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9ww5r/crc-debug-wwxk6" podStartSLOduration=1.978798475 podStartE2EDuration="1.978798475s" podCreationTimestamp="2025-09-29 10:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:52:39.972627406 +0000 UTC m=+3890.177795737" watchObservedRunningTime="2025-09-29 10:52:39.978798475 +0000 UTC m=+3890.183966796" Sep 29 10:52:54 crc kubenswrapper[4891]: I0929 10:52:54.396005 4891 scope.go:117] "RemoveContainer" containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 10:52:54 crc kubenswrapper[4891]: E0929 10:52:54.397727 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:53:06 crc kubenswrapper[4891]: I0929 10:53:06.398017 4891 scope.go:117] "RemoveContainer" containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 10:53:06 crc kubenswrapper[4891]: E0929 
10:53:06.399044 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:53:19 crc kubenswrapper[4891]: I0929 10:53:19.396245 4891 scope.go:117] "RemoveContainer" containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 10:53:19 crc kubenswrapper[4891]: E0929 10:53:19.396944 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:53:30 crc kubenswrapper[4891]: I0929 10:53:30.403657 4891 scope.go:117] "RemoveContainer" containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 10:53:30 crc kubenswrapper[4891]: E0929 10:53:30.407136 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:53:38 crc kubenswrapper[4891]: I0929 10:53:38.741015 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-7ff9945478-9v77b_7efda6e6-9019-4909-96be-068496b2577f/barbican-api/0.log" Sep 29 10:53:38 crc kubenswrapper[4891]: I0929 10:53:38.747948 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7ff9945478-9v77b_7efda6e6-9019-4909-96be-068496b2577f/barbican-api-log/0.log" Sep 29 10:53:38 crc kubenswrapper[4891]: I0929 10:53:38.930487 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-748f5656b6-pdpff_d8d04caa-6db6-41c2-bf9b-f5ed373e9799/barbican-keystone-listener-log/0.log" Sep 29 10:53:38 crc kubenswrapper[4891]: I0929 10:53:38.932008 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-748f5656b6-pdpff_d8d04caa-6db6-41c2-bf9b-f5ed373e9799/barbican-keystone-listener/0.log" Sep 29 10:53:39 crc kubenswrapper[4891]: I0929 10:53:39.091928 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-677dd7cdbc-drpcv_e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99/barbican-worker/0.log" Sep 29 10:53:39 crc kubenswrapper[4891]: I0929 10:53:39.126436 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-677dd7cdbc-drpcv_e6e57a54-3d8a-4ec7-8f8a-d8d049d4ed99/barbican-worker-log/0.log" Sep 29 10:53:39 crc kubenswrapper[4891]: I0929 10:53:39.323321 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-6shb4_9c677d7a-2716-4c8d-8d87-7c158ca5de6c/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:53:39 crc kubenswrapper[4891]: I0929 10:53:39.500164 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7f919b2-f640-44b6-83f9-f870057ba63a/ceilometer-central-agent/0.log" Sep 29 10:53:39 crc kubenswrapper[4891]: I0929 10:53:39.519590 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_e7f919b2-f640-44b6-83f9-f870057ba63a/proxy-httpd/0.log" Sep 29 10:53:39 crc kubenswrapper[4891]: I0929 10:53:39.536635 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7f919b2-f640-44b6-83f9-f870057ba63a/ceilometer-notification-agent/0.log" Sep 29 10:53:39 crc kubenswrapper[4891]: I0929 10:53:39.662718 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7f919b2-f640-44b6-83f9-f870057ba63a/sg-core/0.log" Sep 29 10:53:39 crc kubenswrapper[4891]: I0929 10:53:39.760146 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_77aa5ca3-797d-4f00-8f2d-d735b77d9965/cinder-api/0.log" Sep 29 10:53:39 crc kubenswrapper[4891]: I0929 10:53:39.905404 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_77aa5ca3-797d-4f00-8f2d-d735b77d9965/cinder-api-log/0.log" Sep 29 10:53:40 crc kubenswrapper[4891]: I0929 10:53:40.049166 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_67d0b646-e147-42d3-8ef9-9001b2b24313/cinder-scheduler/0.log" Sep 29 10:53:40 crc kubenswrapper[4891]: I0929 10:53:40.179867 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_67d0b646-e147-42d3-8ef9-9001b2b24313/probe/0.log" Sep 29 10:53:40 crc kubenswrapper[4891]: I0929 10:53:40.288507 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-zzm8t_677e5a8c-37d1-41a1-bd47-2ef7af3a3570/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:53:40 crc kubenswrapper[4891]: I0929 10:53:40.394732 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-fr2jl_ec82e6b1-2bf2-41fe-8a5f-5825d0d8ffa3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:53:40 crc kubenswrapper[4891]: I0929 
10:53:40.564145 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-4jnvs_e33fd450-15eb-4135-bfb6-c42df60defa6/init/0.log" Sep 29 10:53:40 crc kubenswrapper[4891]: I0929 10:53:40.708490 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-4jnvs_e33fd450-15eb-4135-bfb6-c42df60defa6/init/0.log" Sep 29 10:53:40 crc kubenswrapper[4891]: I0929 10:53:40.761393 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-4jnvs_e33fd450-15eb-4135-bfb6-c42df60defa6/dnsmasq-dns/0.log" Sep 29 10:53:40 crc kubenswrapper[4891]: I0929 10:53:40.907866 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-b5dzr_bd3f0561-9568-4116-b84e-1209c964e50f/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:53:40 crc kubenswrapper[4891]: I0929 10:53:40.990062 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ca02e873-8e2c-4958-a757-92efa57fdea8/glance-httpd/0.log" Sep 29 10:53:41 crc kubenswrapper[4891]: I0929 10:53:41.125926 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ca02e873-8e2c-4958-a757-92efa57fdea8/glance-log/0.log" Sep 29 10:53:41 crc kubenswrapper[4891]: I0929 10:53:41.233864 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a6e7444d-97cc-440f-92de-e9db5ff440b5/glance-httpd/0.log" Sep 29 10:53:41 crc kubenswrapper[4891]: I0929 10:53:41.351229 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a6e7444d-97cc-440f-92de-e9db5ff440b5/glance-log/0.log" Sep 29 10:53:41 crc kubenswrapper[4891]: I0929 10:53:41.552244 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6d8cd8ff44-d8rc8_d464aff7-6448-4eaf-b88e-01a8acc3e42a/horizon/0.log" 
Sep 29 10:53:41 crc kubenswrapper[4891]: I0929 10:53:41.788394 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-n4b9k_81439ac0-9a3d-434f-8122-90cc5eeeba97/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:53:41 crc kubenswrapper[4891]: I0929 10:53:41.928764 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6d8cd8ff44-d8rc8_d464aff7-6448-4eaf-b88e-01a8acc3e42a/horizon-log/0.log" Sep 29 10:53:42 crc kubenswrapper[4891]: I0929 10:53:42.153687 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-6kwtc_b33262be-68ab-40c1-a34e-c629096460a8/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:53:42 crc kubenswrapper[4891]: I0929 10:53:42.409841 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_34c908de-54eb-4e12-a9a9-735fbf07c433/kube-state-metrics/0.log" Sep 29 10:53:42 crc kubenswrapper[4891]: I0929 10:53:42.414483 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5b9ccf6696-q5jmg_b4050314-008a-4b46-93e7-2d9454fa3d89/keystone-api/0.log" Sep 29 10:53:42 crc kubenswrapper[4891]: I0929 10:53:42.650849 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-l85z5_ed5239b3-e586-4a56-89ce-74977f3509db/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:53:43 crc kubenswrapper[4891]: I0929 10:53:43.004075 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-56d6cd75c7-6j75x_af72b6bb-1073-4ceb-b593-209e646bba5a/neutron-api/0.log" Sep 29 10:53:43 crc kubenswrapper[4891]: I0929 10:53:43.046150 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-56d6cd75c7-6j75x_af72b6bb-1073-4ceb-b593-209e646bba5a/neutron-httpd/0.log" Sep 29 10:53:43 crc kubenswrapper[4891]: I0929 
10:53:43.191455 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-wfkj7_cc623a81-2fe0-42a2-8f61-cb9ab6909984/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:53:43 crc kubenswrapper[4891]: I0929 10:53:43.733825 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ae3bbc79-5ed8-4064-bc90-554ca707171b/nova-api-log/0.log" Sep 29 10:53:43 crc kubenswrapper[4891]: I0929 10:53:43.946406 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_01cf2585-dd79-4154-8567-2c24dee11709/nova-cell0-conductor-conductor/0.log" Sep 29 10:53:43 crc kubenswrapper[4891]: I0929 10:53:43.965714 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ae3bbc79-5ed8-4064-bc90-554ca707171b/nova-api-api/0.log" Sep 29 10:53:44 crc kubenswrapper[4891]: I0929 10:53:44.357118 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_854cde66-3c80-472d-b232-45231eef0bbd/nova-cell1-conductor-conductor/0.log" Sep 29 10:53:44 crc kubenswrapper[4891]: I0929 10:53:44.375088 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_2e574b8c-0b11-4d63-a842-239dbbf69258/nova-cell1-novncproxy-novncproxy/0.log" Sep 29 10:53:44 crc kubenswrapper[4891]: I0929 10:53:44.587908 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-nbfms_9257358d-6c0c-43ba-831e-c68505df09d8/nova-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:53:44 crc kubenswrapper[4891]: I0929 10:53:44.653163 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_17f1cdf8-c8a7-42b7-a864-e89db1b08cb7/nova-metadata-log/0.log" Sep 29 10:53:45 crc kubenswrapper[4891]: I0929 10:53:45.218440 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_067392b2-a609-44b9-8796-26df77b11d9e/nova-scheduler-scheduler/0.log" Sep 29 10:53:45 crc kubenswrapper[4891]: I0929 10:53:45.396125 4891 scope.go:117] "RemoveContainer" containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 10:53:45 crc kubenswrapper[4891]: E0929 10:53:45.396475 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:53:45 crc kubenswrapper[4891]: I0929 10:53:45.421005 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_71a06463-0c24-4e9a-a4e7-4b0143207f46/mysql-bootstrap/0.log" Sep 29 10:53:45 crc kubenswrapper[4891]: I0929 10:53:45.594344 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_71a06463-0c24-4e9a-a4e7-4b0143207f46/mysql-bootstrap/0.log" Sep 29 10:53:45 crc kubenswrapper[4891]: I0929 10:53:45.627513 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_71a06463-0c24-4e9a-a4e7-4b0143207f46/galera/0.log" Sep 29 10:53:45 crc kubenswrapper[4891]: I0929 10:53:45.921486 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_26add200-3f00-406b-8d30-565e1e51fbd3/mysql-bootstrap/0.log" Sep 29 10:53:46 crc kubenswrapper[4891]: I0929 10:53:46.055172 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_26add200-3f00-406b-8d30-565e1e51fbd3/mysql-bootstrap/0.log" Sep 29 10:53:46 crc kubenswrapper[4891]: I0929 10:53:46.146880 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_26add200-3f00-406b-8d30-565e1e51fbd3/galera/0.log" Sep 29 10:53:46 crc kubenswrapper[4891]: I0929 10:53:46.263345 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_17f1cdf8-c8a7-42b7-a864-e89db1b08cb7/nova-metadata-metadata/0.log" Sep 29 10:53:46 crc kubenswrapper[4891]: I0929 10:53:46.367007 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_d3e0b825-0c6a-49ed-bb87-097ab0e686ee/openstackclient/0.log" Sep 29 10:53:46 crc kubenswrapper[4891]: I0929 10:53:46.573626 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-jq4xk_7484acb7-f4b2-417b-a478-86b8c5999c34/ovn-controller/0.log" Sep 29 10:53:46 crc kubenswrapper[4891]: I0929 10:53:46.686613 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-czcgf_aeab10b4-2f08-4eed-88eb-ba6f26db6cd0/openstack-network-exporter/0.log" Sep 29 10:53:46 crc kubenswrapper[4891]: I0929 10:53:46.910671 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vxc7p_c1fc0a48-e2b3-479b-948c-ff2279a7205c/ovsdb-server-init/0.log" Sep 29 10:53:47 crc kubenswrapper[4891]: I0929 10:53:47.040837 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vxc7p_c1fc0a48-e2b3-479b-948c-ff2279a7205c/ovsdb-server-init/0.log" Sep 29 10:53:47 crc kubenswrapper[4891]: I0929 10:53:47.093784 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vxc7p_c1fc0a48-e2b3-479b-948c-ff2279a7205c/ovsdb-server/0.log" Sep 29 10:53:47 crc kubenswrapper[4891]: I0929 10:53:47.098406 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vxc7p_c1fc0a48-e2b3-479b-948c-ff2279a7205c/ovs-vswitchd/0.log" Sep 29 10:53:47 crc kubenswrapper[4891]: I0929 10:53:47.290247 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-frbbc_1dc04648-883f-4273-bf36-d550e5caba61/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:53:47 crc kubenswrapper[4891]: I0929 10:53:47.486347 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_942ef260-597a-42db-9123-1e9e0b1c4e1b/openstack-network-exporter/0.log" Sep 29 10:53:47 crc kubenswrapper[4891]: I0929 10:53:47.512079 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_942ef260-597a-42db-9123-1e9e0b1c4e1b/ovn-northd/0.log" Sep 29 10:53:47 crc kubenswrapper[4891]: I0929 10:53:47.717592 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_807cd996-3d20-4f16-b5bb-3b4e4da82775/openstack-network-exporter/0.log" Sep 29 10:53:47 crc kubenswrapper[4891]: I0929 10:53:47.761396 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_807cd996-3d20-4f16-b5bb-3b4e4da82775/ovsdbserver-nb/0.log" Sep 29 10:53:47 crc kubenswrapper[4891]: I0929 10:53:47.933710 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c6033a7a-34ac-409d-ab81-035b291364aa/openstack-network-exporter/0.log" Sep 29 10:53:47 crc kubenswrapper[4891]: I0929 10:53:47.960103 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c6033a7a-34ac-409d-ab81-035b291364aa/ovsdbserver-sb/0.log" Sep 29 10:53:48 crc kubenswrapper[4891]: I0929 10:53:48.170524 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-696f7ffc96-xhjxt_e8ec980b-adab-4378-a632-0de5186250dd/placement-api/0.log" Sep 29 10:53:48 crc kubenswrapper[4891]: I0929 10:53:48.243960 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-696f7ffc96-xhjxt_e8ec980b-adab-4378-a632-0de5186250dd/placement-log/0.log" Sep 29 10:53:48 crc kubenswrapper[4891]: I0929 10:53:48.343266 4891 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cdea1706-076e-4094-a87b-d79580a81fcd/setup-container/0.log" Sep 29 10:53:48 crc kubenswrapper[4891]: I0929 10:53:48.573096 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cdea1706-076e-4094-a87b-d79580a81fcd/setup-container/0.log" Sep 29 10:53:48 crc kubenswrapper[4891]: I0929 10:53:48.640203 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cdea1706-076e-4094-a87b-d79580a81fcd/rabbitmq/0.log" Sep 29 10:53:48 crc kubenswrapper[4891]: I0929 10:53:48.854056 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e1fd90ef-a2cb-4931-ba68-9e302e943c2b/setup-container/0.log" Sep 29 10:53:49 crc kubenswrapper[4891]: I0929 10:53:49.053685 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e1fd90ef-a2cb-4931-ba68-9e302e943c2b/rabbitmq/0.log" Sep 29 10:53:49 crc kubenswrapper[4891]: I0929 10:53:49.185998 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e1fd90ef-a2cb-4931-ba68-9e302e943c2b/setup-container/0.log" Sep 29 10:53:49 crc kubenswrapper[4891]: I0929 10:53:49.336215 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-lb5p5_cd0ca11c-98f5-4734-bc9a-fef72b1004f8/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:53:49 crc kubenswrapper[4891]: I0929 10:53:49.453378 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-6t8ns_d51c52c6-2e99-465a-9654-c58b12dd213e/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:53:49 crc kubenswrapper[4891]: I0929 10:53:49.603055 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-dk89d_6398fd57-3f6f-4c01-98da-81f0ad16c4a6/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:53:49 crc kubenswrapper[4891]: I0929 10:53:49.758527 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-2zlpr_92316376-b91d-4e78-ac0c-6f03f1be5f26/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:53:49 crc kubenswrapper[4891]: I0929 10:53:49.936132 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-68vb5_e5a83354-1dda-4488-a048-16ac1b5f36f5/ssh-known-hosts-edpm-deployment/0.log" Sep 29 10:53:50 crc kubenswrapper[4891]: I0929 10:53:50.190594 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7df6f5468c-2kcvk_82a9d505-81c4-410a-9707-adb83f47f425/proxy-server/0.log" Sep 29 10:53:50 crc kubenswrapper[4891]: I0929 10:53:50.258967 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7df6f5468c-2kcvk_82a9d505-81c4-410a-9707-adb83f47f425/proxy-httpd/0.log" Sep 29 10:53:50 crc kubenswrapper[4891]: I0929 10:53:50.434078 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-fnxwg_becd282d-9d1a-4bf8-8e48-cdbab75047e1/swift-ring-rebalance/0.log" Sep 29 10:53:50 crc kubenswrapper[4891]: I0929 10:53:50.485491 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/account-auditor/0.log" Sep 29 10:53:50 crc kubenswrapper[4891]: I0929 10:53:50.580815 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/account-reaper/0.log" Sep 29 10:53:50 crc kubenswrapper[4891]: I0929 10:53:50.690761 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/account-replicator/0.log" Sep 29 10:53:50 crc kubenswrapper[4891]: I0929 10:53:50.709534 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/account-server/0.log" Sep 29 10:53:50 crc kubenswrapper[4891]: I0929 10:53:50.761394 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/container-auditor/0.log" Sep 29 10:53:50 crc kubenswrapper[4891]: I0929 10:53:50.892583 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/container-server/0.log" Sep 29 10:53:50 crc kubenswrapper[4891]: I0929 10:53:50.923168 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/container-replicator/0.log" Sep 29 10:53:50 crc kubenswrapper[4891]: I0929 10:53:50.945661 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/container-updater/0.log" Sep 29 10:53:51 crc kubenswrapper[4891]: I0929 10:53:51.691549 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/object-auditor/0.log" Sep 29 10:53:51 crc kubenswrapper[4891]: I0929 10:53:51.790050 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/object-expirer/0.log" Sep 29 10:53:51 crc kubenswrapper[4891]: I0929 10:53:51.909398 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/object-replicator/0.log" Sep 29 10:53:51 crc kubenswrapper[4891]: I0929 10:53:51.976543 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/object-server/0.log" Sep 29 10:53:52 crc kubenswrapper[4891]: I0929 10:53:52.016391 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/object-updater/0.log" Sep 29 10:53:52 crc kubenswrapper[4891]: I0929 10:53:52.154655 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/rsync/0.log" Sep 29 10:53:52 crc kubenswrapper[4891]: I0929 10:53:52.233994 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_66ec83ce-b4c6-412b-b7c4-6a61c6914c0e/swift-recon-cron/0.log" Sep 29 10:53:52 crc kubenswrapper[4891]: I0929 10:53:52.436538 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-mbnfg_451c7a1c-dd37-464d-b2c8-7924f1882509/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:53:52 crc kubenswrapper[4891]: I0929 10:53:52.570948 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_ae98e843-bdec-443e-8389-9a58c187f5bd/tempest-tests-tempest-tests-runner/0.log" Sep 29 10:53:53 crc kubenswrapper[4891]: I0929 10:53:53.163999 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-wrjfs_a4406439-b507-4572-b458-58d0ddf2b94d/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:53:53 crc kubenswrapper[4891]: I0929 10:53:53.390029 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_0ae22834-3106-47d5-a04c-0ab9327991df/test-operator-logs-container/0.log" Sep 29 10:54:00 crc kubenswrapper[4891]: I0929 10:54:00.408798 4891 scope.go:117] "RemoveContainer" containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 
10:54:00 crc kubenswrapper[4891]: E0929 10:54:00.409524 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:54:02 crc kubenswrapper[4891]: I0929 10:54:02.152093 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_9ec260f8-616d-4e46-8685-0dcabdf10a16/memcached/0.log" Sep 29 10:54:13 crc kubenswrapper[4891]: I0929 10:54:13.397523 4891 scope.go:117] "RemoveContainer" containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 10:54:13 crc kubenswrapper[4891]: E0929 10:54:13.398532 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:54:18 crc kubenswrapper[4891]: I0929 10:54:18.720436 4891 scope.go:117] "RemoveContainer" containerID="5befeac4ae7041e4c33b93beb0d10eb906e538577fe32b4f025f9d2f72aaffac" Sep 29 10:54:24 crc kubenswrapper[4891]: I0929 10:54:24.911432 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9ww5r/crc-debug-wwxk6" event={"ID":"9f16c6f6-9b1c-4bea-a66e-8bdc678ca682","Type":"ContainerDied","Data":"71c31e21cc80ce015ae216f6f52b7093b045f1e91daee101c63e77b0119fca0e"} Sep 29 10:54:24 crc kubenswrapper[4891]: I0929 10:54:24.910679 4891 generic.go:334] "Generic (PLEG): container finished" 
podID="9f16c6f6-9b1c-4bea-a66e-8bdc678ca682" containerID="71c31e21cc80ce015ae216f6f52b7093b045f1e91daee101c63e77b0119fca0e" exitCode=0 Sep 29 10:54:26 crc kubenswrapper[4891]: I0929 10:54:26.051299 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9ww5r/crc-debug-wwxk6" Sep 29 10:54:26 crc kubenswrapper[4891]: I0929 10:54:26.100805 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9ww5r/crc-debug-wwxk6"] Sep 29 10:54:26 crc kubenswrapper[4891]: I0929 10:54:26.114086 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9ww5r/crc-debug-wwxk6"] Sep 29 10:54:26 crc kubenswrapper[4891]: I0929 10:54:26.165446 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7q6gm\" (UniqueName: \"kubernetes.io/projected/9f16c6f6-9b1c-4bea-a66e-8bdc678ca682-kube-api-access-7q6gm\") pod \"9f16c6f6-9b1c-4bea-a66e-8bdc678ca682\" (UID: \"9f16c6f6-9b1c-4bea-a66e-8bdc678ca682\") " Sep 29 10:54:26 crc kubenswrapper[4891]: I0929 10:54:26.165719 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9f16c6f6-9b1c-4bea-a66e-8bdc678ca682-host\") pod \"9f16c6f6-9b1c-4bea-a66e-8bdc678ca682\" (UID: \"9f16c6f6-9b1c-4bea-a66e-8bdc678ca682\") " Sep 29 10:54:26 crc kubenswrapper[4891]: I0929 10:54:26.165986 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f16c6f6-9b1c-4bea-a66e-8bdc678ca682-host" (OuterVolumeSpecName: "host") pod "9f16c6f6-9b1c-4bea-a66e-8bdc678ca682" (UID: "9f16c6f6-9b1c-4bea-a66e-8bdc678ca682"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:54:26 crc kubenswrapper[4891]: I0929 10:54:26.166772 4891 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9f16c6f6-9b1c-4bea-a66e-8bdc678ca682-host\") on node \"crc\" DevicePath \"\"" Sep 29 10:54:26 crc kubenswrapper[4891]: I0929 10:54:26.171202 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f16c6f6-9b1c-4bea-a66e-8bdc678ca682-kube-api-access-7q6gm" (OuterVolumeSpecName: "kube-api-access-7q6gm") pod "9f16c6f6-9b1c-4bea-a66e-8bdc678ca682" (UID: "9f16c6f6-9b1c-4bea-a66e-8bdc678ca682"). InnerVolumeSpecName "kube-api-access-7q6gm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:54:26 crc kubenswrapper[4891]: I0929 10:54:26.268451 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7q6gm\" (UniqueName: \"kubernetes.io/projected/9f16c6f6-9b1c-4bea-a66e-8bdc678ca682-kube-api-access-7q6gm\") on node \"crc\" DevicePath \"\"" Sep 29 10:54:26 crc kubenswrapper[4891]: I0929 10:54:26.396388 4891 scope.go:117] "RemoveContainer" containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 10:54:26 crc kubenswrapper[4891]: E0929 10:54:26.396850 4891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gb8tp_openshift-machine-config-operator(582de198-5a15-4c4c-aaea-881c638a42ac)\"" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" Sep 29 10:54:26 crc kubenswrapper[4891]: I0929 10:54:26.409022 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f16c6f6-9b1c-4bea-a66e-8bdc678ca682" path="/var/lib/kubelet/pods/9f16c6f6-9b1c-4bea-a66e-8bdc678ca682/volumes" Sep 29 10:54:26 crc 
kubenswrapper[4891]: I0929 10:54:26.941245 4891 scope.go:117] "RemoveContainer" containerID="71c31e21cc80ce015ae216f6f52b7093b045f1e91daee101c63e77b0119fca0e" Sep 29 10:54:26 crc kubenswrapper[4891]: I0929 10:54:26.941306 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9ww5r/crc-debug-wwxk6" Sep 29 10:54:27 crc kubenswrapper[4891]: I0929 10:54:27.308865 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9ww5r/crc-debug-lr6t7"] Sep 29 10:54:27 crc kubenswrapper[4891]: E0929 10:54:27.309448 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f16c6f6-9b1c-4bea-a66e-8bdc678ca682" containerName="container-00" Sep 29 10:54:27 crc kubenswrapper[4891]: I0929 10:54:27.309470 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f16c6f6-9b1c-4bea-a66e-8bdc678ca682" containerName="container-00" Sep 29 10:54:27 crc kubenswrapper[4891]: I0929 10:54:27.309877 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f16c6f6-9b1c-4bea-a66e-8bdc678ca682" containerName="container-00" Sep 29 10:54:27 crc kubenswrapper[4891]: I0929 10:54:27.310786 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9ww5r/crc-debug-lr6t7" Sep 29 10:54:27 crc kubenswrapper[4891]: I0929 10:54:27.387417 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smpzm\" (UniqueName: \"kubernetes.io/projected/5836853e-b710-4e27-94a6-3eb75e6de514-kube-api-access-smpzm\") pod \"crc-debug-lr6t7\" (UID: \"5836853e-b710-4e27-94a6-3eb75e6de514\") " pod="openshift-must-gather-9ww5r/crc-debug-lr6t7" Sep 29 10:54:27 crc kubenswrapper[4891]: I0929 10:54:27.387744 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5836853e-b710-4e27-94a6-3eb75e6de514-host\") pod \"crc-debug-lr6t7\" (UID: \"5836853e-b710-4e27-94a6-3eb75e6de514\") " pod="openshift-must-gather-9ww5r/crc-debug-lr6t7" Sep 29 10:54:27 crc kubenswrapper[4891]: I0929 10:54:27.489639 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smpzm\" (UniqueName: \"kubernetes.io/projected/5836853e-b710-4e27-94a6-3eb75e6de514-kube-api-access-smpzm\") pod \"crc-debug-lr6t7\" (UID: \"5836853e-b710-4e27-94a6-3eb75e6de514\") " pod="openshift-must-gather-9ww5r/crc-debug-lr6t7" Sep 29 10:54:27 crc kubenswrapper[4891]: I0929 10:54:27.490043 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5836853e-b710-4e27-94a6-3eb75e6de514-host\") pod \"crc-debug-lr6t7\" (UID: \"5836853e-b710-4e27-94a6-3eb75e6de514\") " pod="openshift-must-gather-9ww5r/crc-debug-lr6t7" Sep 29 10:54:27 crc kubenswrapper[4891]: I0929 10:54:27.490307 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5836853e-b710-4e27-94a6-3eb75e6de514-host\") pod \"crc-debug-lr6t7\" (UID: \"5836853e-b710-4e27-94a6-3eb75e6de514\") " pod="openshift-must-gather-9ww5r/crc-debug-lr6t7" Sep 29 10:54:27 crc 
kubenswrapper[4891]: I0929 10:54:27.525246 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smpzm\" (UniqueName: \"kubernetes.io/projected/5836853e-b710-4e27-94a6-3eb75e6de514-kube-api-access-smpzm\") pod \"crc-debug-lr6t7\" (UID: \"5836853e-b710-4e27-94a6-3eb75e6de514\") " pod="openshift-must-gather-9ww5r/crc-debug-lr6t7" Sep 29 10:54:27 crc kubenswrapper[4891]: I0929 10:54:27.637483 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9ww5r/crc-debug-lr6t7" Sep 29 10:54:27 crc kubenswrapper[4891]: I0929 10:54:27.956054 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9ww5r/crc-debug-lr6t7" event={"ID":"5836853e-b710-4e27-94a6-3eb75e6de514","Type":"ContainerStarted","Data":"0bab063643ab45bb495d65617f25c0de1f1de13e9ba148fa71f7d9b96c3b4a55"} Sep 29 10:54:27 crc kubenswrapper[4891]: I0929 10:54:27.956119 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9ww5r/crc-debug-lr6t7" event={"ID":"5836853e-b710-4e27-94a6-3eb75e6de514","Type":"ContainerStarted","Data":"1fe5c803c23616038c68c3df9babcaaaee81cc808122f5323a1474f04968e630"} Sep 29 10:54:27 crc kubenswrapper[4891]: I0929 10:54:27.974917 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9ww5r/crc-debug-lr6t7" podStartSLOduration=0.974899798 podStartE2EDuration="974.899798ms" podCreationTimestamp="2025-09-29 10:54:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:54:27.970916133 +0000 UTC m=+3998.176084474" watchObservedRunningTime="2025-09-29 10:54:27.974899798 +0000 UTC m=+3998.180068119" Sep 29 10:54:28 crc kubenswrapper[4891]: I0929 10:54:28.965633 4891 generic.go:334] "Generic (PLEG): container finished" podID="5836853e-b710-4e27-94a6-3eb75e6de514" 
containerID="0bab063643ab45bb495d65617f25c0de1f1de13e9ba148fa71f7d9b96c3b4a55" exitCode=0 Sep 29 10:54:28 crc kubenswrapper[4891]: I0929 10:54:28.965673 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9ww5r/crc-debug-lr6t7" event={"ID":"5836853e-b710-4e27-94a6-3eb75e6de514","Type":"ContainerDied","Data":"0bab063643ab45bb495d65617f25c0de1f1de13e9ba148fa71f7d9b96c3b4a55"} Sep 29 10:54:30 crc kubenswrapper[4891]: I0929 10:54:30.081155 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9ww5r/crc-debug-lr6t7" Sep 29 10:54:30 crc kubenswrapper[4891]: I0929 10:54:30.132384 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smpzm\" (UniqueName: \"kubernetes.io/projected/5836853e-b710-4e27-94a6-3eb75e6de514-kube-api-access-smpzm\") pod \"5836853e-b710-4e27-94a6-3eb75e6de514\" (UID: \"5836853e-b710-4e27-94a6-3eb75e6de514\") " Sep 29 10:54:30 crc kubenswrapper[4891]: I0929 10:54:30.132416 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5836853e-b710-4e27-94a6-3eb75e6de514-host\") pod \"5836853e-b710-4e27-94a6-3eb75e6de514\" (UID: \"5836853e-b710-4e27-94a6-3eb75e6de514\") " Sep 29 10:54:30 crc kubenswrapper[4891]: I0929 10:54:30.132847 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5836853e-b710-4e27-94a6-3eb75e6de514-host" (OuterVolumeSpecName: "host") pod "5836853e-b710-4e27-94a6-3eb75e6de514" (UID: "5836853e-b710-4e27-94a6-3eb75e6de514"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:54:30 crc kubenswrapper[4891]: I0929 10:54:30.133150 4891 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5836853e-b710-4e27-94a6-3eb75e6de514-host\") on node \"crc\" DevicePath \"\"" Sep 29 10:54:30 crc kubenswrapper[4891]: I0929 10:54:30.150553 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5836853e-b710-4e27-94a6-3eb75e6de514-kube-api-access-smpzm" (OuterVolumeSpecName: "kube-api-access-smpzm") pod "5836853e-b710-4e27-94a6-3eb75e6de514" (UID: "5836853e-b710-4e27-94a6-3eb75e6de514"). InnerVolumeSpecName "kube-api-access-smpzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:54:30 crc kubenswrapper[4891]: I0929 10:54:30.234429 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smpzm\" (UniqueName: \"kubernetes.io/projected/5836853e-b710-4e27-94a6-3eb75e6de514-kube-api-access-smpzm\") on node \"crc\" DevicePath \"\"" Sep 29 10:54:30 crc kubenswrapper[4891]: I0929 10:54:30.986844 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9ww5r/crc-debug-lr6t7" event={"ID":"5836853e-b710-4e27-94a6-3eb75e6de514","Type":"ContainerDied","Data":"1fe5c803c23616038c68c3df9babcaaaee81cc808122f5323a1474f04968e630"} Sep 29 10:54:30 crc kubenswrapper[4891]: I0929 10:54:30.987173 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fe5c803c23616038c68c3df9babcaaaee81cc808122f5323a1474f04968e630" Sep 29 10:54:30 crc kubenswrapper[4891]: I0929 10:54:30.986892 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9ww5r/crc-debug-lr6t7" Sep 29 10:54:34 crc kubenswrapper[4891]: I0929 10:54:34.832447 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9ww5r/crc-debug-lr6t7"] Sep 29 10:54:34 crc kubenswrapper[4891]: I0929 10:54:34.839673 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9ww5r/crc-debug-lr6t7"] Sep 29 10:54:36 crc kubenswrapper[4891]: I0929 10:54:36.051260 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9ww5r/crc-debug-jsrff"] Sep 29 10:54:36 crc kubenswrapper[4891]: E0929 10:54:36.051630 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5836853e-b710-4e27-94a6-3eb75e6de514" containerName="container-00" Sep 29 10:54:36 crc kubenswrapper[4891]: I0929 10:54:36.051645 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="5836853e-b710-4e27-94a6-3eb75e6de514" containerName="container-00" Sep 29 10:54:36 crc kubenswrapper[4891]: I0929 10:54:36.051866 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="5836853e-b710-4e27-94a6-3eb75e6de514" containerName="container-00" Sep 29 10:54:36 crc kubenswrapper[4891]: I0929 10:54:36.052432 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9ww5r/crc-debug-jsrff" Sep 29 10:54:36 crc kubenswrapper[4891]: I0929 10:54:36.118963 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh9dj\" (UniqueName: \"kubernetes.io/projected/7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb-kube-api-access-rh9dj\") pod \"crc-debug-jsrff\" (UID: \"7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb\") " pod="openshift-must-gather-9ww5r/crc-debug-jsrff" Sep 29 10:54:36 crc kubenswrapper[4891]: I0929 10:54:36.119044 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb-host\") pod \"crc-debug-jsrff\" (UID: \"7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb\") " pod="openshift-must-gather-9ww5r/crc-debug-jsrff" Sep 29 10:54:36 crc kubenswrapper[4891]: I0929 10:54:36.221712 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb-host\") pod \"crc-debug-jsrff\" (UID: \"7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb\") " pod="openshift-must-gather-9ww5r/crc-debug-jsrff" Sep 29 10:54:36 crc kubenswrapper[4891]: I0929 10:54:36.221832 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb-host\") pod \"crc-debug-jsrff\" (UID: \"7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb\") " pod="openshift-must-gather-9ww5r/crc-debug-jsrff" Sep 29 10:54:36 crc kubenswrapper[4891]: I0929 10:54:36.222123 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh9dj\" (UniqueName: \"kubernetes.io/projected/7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb-kube-api-access-rh9dj\") pod \"crc-debug-jsrff\" (UID: \"7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb\") " pod="openshift-must-gather-9ww5r/crc-debug-jsrff" Sep 29 10:54:36 crc 
kubenswrapper[4891]: I0929 10:54:36.251643 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh9dj\" (UniqueName: \"kubernetes.io/projected/7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb-kube-api-access-rh9dj\") pod \"crc-debug-jsrff\" (UID: \"7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb\") " pod="openshift-must-gather-9ww5r/crc-debug-jsrff" Sep 29 10:54:36 crc kubenswrapper[4891]: I0929 10:54:36.391562 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9ww5r/crc-debug-jsrff" Sep 29 10:54:36 crc kubenswrapper[4891]: I0929 10:54:36.412922 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5836853e-b710-4e27-94a6-3eb75e6de514" path="/var/lib/kubelet/pods/5836853e-b710-4e27-94a6-3eb75e6de514/volumes" Sep 29 10:54:37 crc kubenswrapper[4891]: I0929 10:54:37.050741 4891 generic.go:334] "Generic (PLEG): container finished" podID="7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb" containerID="683d9caff39c258f45799298d36d0377dc02619714aa69218131d6bbc7fbd118" exitCode=0 Sep 29 10:54:37 crc kubenswrapper[4891]: I0929 10:54:37.050881 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9ww5r/crc-debug-jsrff" event={"ID":"7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb","Type":"ContainerDied","Data":"683d9caff39c258f45799298d36d0377dc02619714aa69218131d6bbc7fbd118"} Sep 29 10:54:37 crc kubenswrapper[4891]: I0929 10:54:37.050966 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9ww5r/crc-debug-jsrff" event={"ID":"7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb","Type":"ContainerStarted","Data":"6d0908ff154ae77b01476e164f0122b2cd951c6fe99dd647493aa0d70fe96c7d"} Sep 29 10:54:37 crc kubenswrapper[4891]: I0929 10:54:37.115600 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9ww5r/crc-debug-jsrff"] Sep 29 10:54:37 crc kubenswrapper[4891]: I0929 10:54:37.129602 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-9ww5r/crc-debug-jsrff"] Sep 29 10:54:38 crc kubenswrapper[4891]: I0929 10:54:38.157810 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9ww5r/crc-debug-jsrff" Sep 29 10:54:38 crc kubenswrapper[4891]: I0929 10:54:38.256511 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh9dj\" (UniqueName: \"kubernetes.io/projected/7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb-kube-api-access-rh9dj\") pod \"7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb\" (UID: \"7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb\") " Sep 29 10:54:38 crc kubenswrapper[4891]: I0929 10:54:38.256637 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb-host\") pod \"7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb\" (UID: \"7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb\") " Sep 29 10:54:38 crc kubenswrapper[4891]: I0929 10:54:38.256813 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb-host" (OuterVolumeSpecName: "host") pod "7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb" (UID: "7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:54:38 crc kubenswrapper[4891]: I0929 10:54:38.257444 4891 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb-host\") on node \"crc\" DevicePath \"\"" Sep 29 10:54:38 crc kubenswrapper[4891]: I0929 10:54:38.264762 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb-kube-api-access-rh9dj" (OuterVolumeSpecName: "kube-api-access-rh9dj") pod "7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb" (UID: "7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb"). 
InnerVolumeSpecName "kube-api-access-rh9dj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:54:38 crc kubenswrapper[4891]: I0929 10:54:38.358921 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh9dj\" (UniqueName: \"kubernetes.io/projected/7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb-kube-api-access-rh9dj\") on node \"crc\" DevicePath \"\"" Sep 29 10:54:38 crc kubenswrapper[4891]: I0929 10:54:38.405403 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb" path="/var/lib/kubelet/pods/7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb/volumes" Sep 29 10:54:38 crc kubenswrapper[4891]: I0929 10:54:38.701009 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj_47086f2a-3e89-4170-9f19-5bfd1d07c1ff/util/0.log" Sep 29 10:54:38 crc kubenswrapper[4891]: I0929 10:54:38.858900 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj_47086f2a-3e89-4170-9f19-5bfd1d07c1ff/util/0.log" Sep 29 10:54:38 crc kubenswrapper[4891]: I0929 10:54:38.874040 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj_47086f2a-3e89-4170-9f19-5bfd1d07c1ff/pull/0.log" Sep 29 10:54:38 crc kubenswrapper[4891]: I0929 10:54:38.896317 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj_47086f2a-3e89-4170-9f19-5bfd1d07c1ff/pull/0.log" Sep 29 10:54:39 crc kubenswrapper[4891]: I0929 10:54:39.037289 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj_47086f2a-3e89-4170-9f19-5bfd1d07c1ff/pull/0.log" Sep 29 10:54:39 crc kubenswrapper[4891]: I0929 10:54:39.037472 
4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj_47086f2a-3e89-4170-9f19-5bfd1d07c1ff/util/0.log" Sep 29 10:54:39 crc kubenswrapper[4891]: I0929 10:54:39.040076 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7249b6fb83b4fe395da43820d479bae1b9657a4e4bbebce28bf59b65fdrkvvj_47086f2a-3e89-4170-9f19-5bfd1d07c1ff/extract/0.log" Sep 29 10:54:39 crc kubenswrapper[4891]: I0929 10:54:39.068031 4891 scope.go:117] "RemoveContainer" containerID="683d9caff39c258f45799298d36d0377dc02619714aa69218131d6bbc7fbd118" Sep 29 10:54:39 crc kubenswrapper[4891]: I0929 10:54:39.068050 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9ww5r/crc-debug-jsrff" Sep 29 10:54:39 crc kubenswrapper[4891]: I0929 10:54:39.250005 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6495d75b5-w2lkh_28d145a8-69b6-4cf0-be6b-8bfbd0d2df07/kube-rbac-proxy/0.log" Sep 29 10:54:39 crc kubenswrapper[4891]: I0929 10:54:39.287624 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6495d75b5-w2lkh_28d145a8-69b6-4cf0-be6b-8bfbd0d2df07/manager/0.log" Sep 29 10:54:39 crc kubenswrapper[4891]: I0929 10:54:39.322212 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-748c574d75-qmk9v_a7ad802e-1b9c-4ab0-a7eb-82932b6f5090/kube-rbac-proxy/0.log" Sep 29 10:54:39 crc kubenswrapper[4891]: I0929 10:54:39.442333 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d74f4d695-m5dl2_a04ca278-c2e3-4b48-85f8-16972204c367/kube-rbac-proxy/0.log" Sep 29 10:54:39 crc kubenswrapper[4891]: I0929 10:54:39.444395 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-748c574d75-qmk9v_a7ad802e-1b9c-4ab0-a7eb-82932b6f5090/manager/0.log" Sep 29 10:54:39 crc kubenswrapper[4891]: I0929 10:54:39.510594 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d74f4d695-m5dl2_a04ca278-c2e3-4b48-85f8-16972204c367/manager/0.log" Sep 29 10:54:39 crc kubenswrapper[4891]: I0929 10:54:39.612628 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-67b5d44b7f-r4w8j_40dffb60-1139-4864-b251-0aa8c145b66e/kube-rbac-proxy/0.log" Sep 29 10:54:39 crc kubenswrapper[4891]: I0929 10:54:39.701568 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-67b5d44b7f-r4w8j_40dffb60-1139-4864-b251-0aa8c145b66e/manager/0.log" Sep 29 10:54:39 crc kubenswrapper[4891]: I0929 10:54:39.811061 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8ff95898-x2qwd_b1ce187f-22c9-47c7-9f8b-e8d4b6c2aa31/manager/0.log" Sep 29 10:54:39 crc kubenswrapper[4891]: I0929 10:54:39.811548 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8ff95898-x2qwd_b1ce187f-22c9-47c7-9f8b-e8d4b6c2aa31/kube-rbac-proxy/0.log" Sep 29 10:54:39 crc kubenswrapper[4891]: I0929 10:54:39.894189 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-695847bc78-lwsw4_543e23f1-51b6-489d-91d8-b1550bb69680/kube-rbac-proxy/0.log" Sep 29 10:54:39 crc kubenswrapper[4891]: I0929 10:54:39.991115 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-695847bc78-lwsw4_543e23f1-51b6-489d-91d8-b1550bb69680/manager/0.log" Sep 29 10:54:40 crc kubenswrapper[4891]: I0929 10:54:40.112085 4891 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-858cd69f49-zj5dm_75843062-7193-4953-add3-5859f3dce7de/kube-rbac-proxy/0.log" Sep 29 10:54:40 crc kubenswrapper[4891]: I0929 10:54:40.188371 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-858cd69f49-zj5dm_75843062-7193-4953-add3-5859f3dce7de/manager/0.log" Sep 29 10:54:40 crc kubenswrapper[4891]: I0929 10:54:40.218248 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9fc8d5567-8xchz_bddd647a-c213-41dd-9f22-3cef16c4622b/kube-rbac-proxy/0.log" Sep 29 10:54:40 crc kubenswrapper[4891]: I0929 10:54:40.342065 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7bf498966c-2nk4k_6467aac8-0edf-44db-b402-518abc31f6a1/kube-rbac-proxy/0.log" Sep 29 10:54:40 crc kubenswrapper[4891]: I0929 10:54:40.350980 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9fc8d5567-8xchz_bddd647a-c213-41dd-9f22-3cef16c4622b/manager/0.log" Sep 29 10:54:40 crc kubenswrapper[4891]: I0929 10:54:40.440942 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7bf498966c-2nk4k_6467aac8-0edf-44db-b402-518abc31f6a1/manager/0.log" Sep 29 10:54:40 crc kubenswrapper[4891]: I0929 10:54:40.545467 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-56cf9c6b99-6jllh_9f51bd90-5b61-4cec-875e-d515cc501a22/kube-rbac-proxy/0.log" Sep 29 10:54:40 crc kubenswrapper[4891]: I0929 10:54:40.596803 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-56cf9c6b99-6jllh_9f51bd90-5b61-4cec-875e-d515cc501a22/manager/0.log" Sep 29 10:54:40 crc 
kubenswrapper[4891]: I0929 10:54:40.690381 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-687b9cf756-gxs5h_67ca192a-9f26-47d4-b299-35b0522e9e53/kube-rbac-proxy/0.log" Sep 29 10:54:40 crc kubenswrapper[4891]: I0929 10:54:40.742249 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-687b9cf756-gxs5h_67ca192a-9f26-47d4-b299-35b0522e9e53/manager/0.log" Sep 29 10:54:40 crc kubenswrapper[4891]: I0929 10:54:40.866901 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54d766c9f9-f7t8j_177d1c2e-3396-4516-aed4-31227f05abff/kube-rbac-proxy/0.log" Sep 29 10:54:40 crc kubenswrapper[4891]: I0929 10:54:40.951301 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54d766c9f9-f7t8j_177d1c2e-3396-4516-aed4-31227f05abff/manager/0.log" Sep 29 10:54:40 crc kubenswrapper[4891]: I0929 10:54:40.985455 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-hk7zb_910f1b22-b26a-4e74-b716-89b912927374/kube-rbac-proxy/0.log" Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 10:54:41.028168 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fxsmx"] Sep 29 10:54:41 crc kubenswrapper[4891]: E0929 10:54:41.028573 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb" containerName="container-00" Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 10:54:41.028587 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb" containerName="container-00" Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 10:54:41.028765 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b4cf869-b3a6-471b-9e09-5ee4d3cc6fdb" 
containerName="container-00" Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 10:54:41.030087 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxsmx" Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 10:54:41.044598 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxsmx"] Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 10:54:41.124581 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q22vk\" (UniqueName: \"kubernetes.io/projected/15781540-68d3-4eb4-8966-eff30e2ebd33-kube-api-access-q22vk\") pod \"redhat-marketplace-fxsmx\" (UID: \"15781540-68d3-4eb4-8966-eff30e2ebd33\") " pod="openshift-marketplace/redhat-marketplace-fxsmx" Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 10:54:41.124735 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15781540-68d3-4eb4-8966-eff30e2ebd33-catalog-content\") pod \"redhat-marketplace-fxsmx\" (UID: \"15781540-68d3-4eb4-8966-eff30e2ebd33\") " pod="openshift-marketplace/redhat-marketplace-fxsmx" Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 10:54:41.124817 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15781540-68d3-4eb4-8966-eff30e2ebd33-utilities\") pod \"redhat-marketplace-fxsmx\" (UID: \"15781540-68d3-4eb4-8966-eff30e2ebd33\") " pod="openshift-marketplace/redhat-marketplace-fxsmx" Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 10:54:41.174845 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-hk7zb_910f1b22-b26a-4e74-b716-89b912927374/manager/0.log" Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 10:54:41.226637 4891 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15781540-68d3-4eb4-8966-eff30e2ebd33-catalog-content\") pod \"redhat-marketplace-fxsmx\" (UID: \"15781540-68d3-4eb4-8966-eff30e2ebd33\") " pod="openshift-marketplace/redhat-marketplace-fxsmx" Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 10:54:41.226706 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15781540-68d3-4eb4-8966-eff30e2ebd33-utilities\") pod \"redhat-marketplace-fxsmx\" (UID: \"15781540-68d3-4eb4-8966-eff30e2ebd33\") " pod="openshift-marketplace/redhat-marketplace-fxsmx" Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 10:54:41.226758 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q22vk\" (UniqueName: \"kubernetes.io/projected/15781540-68d3-4eb4-8966-eff30e2ebd33-kube-api-access-q22vk\") pod \"redhat-marketplace-fxsmx\" (UID: \"15781540-68d3-4eb4-8966-eff30e2ebd33\") " pod="openshift-marketplace/redhat-marketplace-fxsmx" Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 10:54:41.227195 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15781540-68d3-4eb4-8966-eff30e2ebd33-catalog-content\") pod \"redhat-marketplace-fxsmx\" (UID: \"15781540-68d3-4eb4-8966-eff30e2ebd33\") " pod="openshift-marketplace/redhat-marketplace-fxsmx" Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 10:54:41.227409 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15781540-68d3-4eb4-8966-eff30e2ebd33-utilities\") pod \"redhat-marketplace-fxsmx\" (UID: \"15781540-68d3-4eb4-8966-eff30e2ebd33\") " pod="openshift-marketplace/redhat-marketplace-fxsmx" Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 10:54:41.247177 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-q22vk\" (UniqueName: \"kubernetes.io/projected/15781540-68d3-4eb4-8966-eff30e2ebd33-kube-api-access-q22vk\") pod \"redhat-marketplace-fxsmx\" (UID: \"15781540-68d3-4eb4-8966-eff30e2ebd33\") " pod="openshift-marketplace/redhat-marketplace-fxsmx" Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 10:54:41.290025 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-t4j4f_348984e7-163d-4396-84f5-319eb4fc79fb/kube-rbac-proxy/0.log" Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 10:54:41.321221 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-t4j4f_348984e7-163d-4396-84f5-319eb4fc79fb/manager/0.log" Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 10:54:41.362753 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxsmx" Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 10:54:41.396315 4891 scope.go:117] "RemoveContainer" containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 10:54:41.504800 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-f5bh2_51a34f9a-d71a-45d0-9a76-01d629fc7d79/kube-rbac-proxy/0.log" Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 10:54:41.542463 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-f5bh2_51a34f9a-d71a-45d0-9a76-01d629fc7d79/manager/0.log" Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 10:54:41.727446 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6475d4f6d5-ckgrz_26216b37-e307-4ecb-ade6-2402d26f32d9/kube-rbac-proxy/0.log" Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 
10:54:41.854634 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxsmx"] Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 10:54:41.936697 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-844b5d775b-wwwqn_01b68a39-0d11-4f7f-b7c8-b2e50dae7e2d/kube-rbac-proxy/0.log" Sep 29 10:54:41 crc kubenswrapper[4891]: I0929 10:54:41.979569 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-qvghn_dffc134e-8cef-47fa-a97b-08b58fee948c/registry-server/0.log" Sep 29 10:54:42 crc kubenswrapper[4891]: I0929 10:54:42.111232 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerStarted","Data":"2da827005486a0a15bbdd7c43a77ee70ba395e814d8c72e53cadb7aaabd42575"} Sep 29 10:54:42 crc kubenswrapper[4891]: I0929 10:54:42.121622 4891 generic.go:334] "Generic (PLEG): container finished" podID="15781540-68d3-4eb4-8966-eff30e2ebd33" containerID="ec21a5cc0d492e3627ce518b71c11e2d06d37d98c34b10a726d8ff08c105cd0a" exitCode=0 Sep 29 10:54:42 crc kubenswrapper[4891]: I0929 10:54:42.121665 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxsmx" event={"ID":"15781540-68d3-4eb4-8966-eff30e2ebd33","Type":"ContainerDied","Data":"ec21a5cc0d492e3627ce518b71c11e2d06d37d98c34b10a726d8ff08c105cd0a"} Sep 29 10:54:42 crc kubenswrapper[4891]: I0929 10:54:42.121692 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxsmx" event={"ID":"15781540-68d3-4eb4-8966-eff30e2ebd33","Type":"ContainerStarted","Data":"2b5bfb0ec13f2cc32d21c1b9adac68d5ab34e8e21930c49c4bbdadc5c03be8ca"} Sep 29 10:54:42 crc kubenswrapper[4891]: I0929 10:54:42.124503 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-844b5d775b-wwwqn_01b68a39-0d11-4f7f-b7c8-b2e50dae7e2d/operator/0.log" Sep 29 10:54:42 crc kubenswrapper[4891]: I0929 10:54:42.238601 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5f95c46c78-d45q4_293261f0-9425-4e31-a66d-d8ad8a913228/kube-rbac-proxy/0.log" Sep 29 10:54:42 crc kubenswrapper[4891]: I0929 10:54:42.390740 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5f95c46c78-d45q4_293261f0-9425-4e31-a66d-d8ad8a913228/manager/0.log" Sep 29 10:54:42 crc kubenswrapper[4891]: I0929 10:54:42.437190 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-774b97b48-xlml4_950148f3-aa8c-45bd-9922-6c4e2683d004/kube-rbac-proxy/0.log" Sep 29 10:54:42 crc kubenswrapper[4891]: I0929 10:54:42.456654 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-774b97b48-xlml4_950148f3-aa8c-45bd-9922-6c4e2683d004/manager/0.log" Sep 29 10:54:42 crc kubenswrapper[4891]: I0929 10:54:42.659676 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-8b4x9_31d92d3d-3a46-416c-b5f0-6fb12bb5bead/operator/0.log" Sep 29 10:54:42 crc kubenswrapper[4891]: I0929 10:54:42.731063 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-9xmvx_48c30870-804a-4f13-95f4-ec4a5a02b536/kube-rbac-proxy/0.log" Sep 29 10:54:42 crc kubenswrapper[4891]: I0929 10:54:42.856657 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-9xmvx_48c30870-804a-4f13-95f4-ec4a5a02b536/manager/0.log" Sep 29 10:54:42 crc kubenswrapper[4891]: I0929 10:54:42.868888 4891 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6475d4f6d5-ckgrz_26216b37-e307-4ecb-ade6-2402d26f32d9/manager/0.log" Sep 29 10:54:42 crc kubenswrapper[4891]: I0929 10:54:42.927510 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5bf96cfbc4-8kqf5_539a685d-4cdf-4344-a7a3-448ec5e9ba6e/kube-rbac-proxy/0.log" Sep 29 10:54:43 crc kubenswrapper[4891]: I0929 10:54:43.055205 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-r2lgd_59104851-7ccd-446a-9441-ef993caefd10/kube-rbac-proxy/0.log" Sep 29 10:54:43 crc kubenswrapper[4891]: I0929 10:54:43.062154 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5bf96cfbc4-8kqf5_539a685d-4cdf-4344-a7a3-448ec5e9ba6e/manager/0.log" Sep 29 10:54:43 crc kubenswrapper[4891]: I0929 10:54:43.131521 4891 generic.go:334] "Generic (PLEG): container finished" podID="15781540-68d3-4eb4-8966-eff30e2ebd33" containerID="5ef6c2173f73e7ef028f69a0e24855282aee69a54da2cdd26aab6887fe76fde0" exitCode=0 Sep 29 10:54:43 crc kubenswrapper[4891]: I0929 10:54:43.131560 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxsmx" event={"ID":"15781540-68d3-4eb4-8966-eff30e2ebd33","Type":"ContainerDied","Data":"5ef6c2173f73e7ef028f69a0e24855282aee69a54da2cdd26aab6887fe76fde0"} Sep 29 10:54:43 crc kubenswrapper[4891]: I0929 10:54:43.144528 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-r2lgd_59104851-7ccd-446a-9441-ef993caefd10/manager/0.log" Sep 29 10:54:43 crc kubenswrapper[4891]: I0929 10:54:43.233446 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76669f99c-q67kx_1ab4abbc-82b1-4624-856b-cbd9062184c0/kube-rbac-proxy/0.log" Sep 29 10:54:43 crc kubenswrapper[4891]: I0929 10:54:43.240750 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76669f99c-q67kx_1ab4abbc-82b1-4624-856b-cbd9062184c0/manager/0.log" Sep 29 10:54:44 crc kubenswrapper[4891]: I0929 10:54:44.141202 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxsmx" event={"ID":"15781540-68d3-4eb4-8966-eff30e2ebd33","Type":"ContainerStarted","Data":"5c9540f66b6e2d4bdacc17e967a975811705402346e8fc0937d1d5089411300e"} Sep 29 10:54:44 crc kubenswrapper[4891]: I0929 10:54:44.158928 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fxsmx" podStartSLOduration=1.699813281 podStartE2EDuration="3.15891365s" podCreationTimestamp="2025-09-29 10:54:41 +0000 UTC" firstStartedPulling="2025-09-29 10:54:42.123912174 +0000 UTC m=+4012.329080495" lastFinishedPulling="2025-09-29 10:54:43.583012533 +0000 UTC m=+4013.788180864" observedRunningTime="2025-09-29 10:54:44.154824513 +0000 UTC m=+4014.359992834" watchObservedRunningTime="2025-09-29 10:54:44.15891365 +0000 UTC m=+4014.364081981" Sep 29 10:54:51 crc kubenswrapper[4891]: I0929 10:54:51.364720 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fxsmx" Sep 29 10:54:51 crc kubenswrapper[4891]: I0929 10:54:51.365537 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fxsmx" Sep 29 10:54:51 crc kubenswrapper[4891]: I0929 10:54:51.436721 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fxsmx" Sep 29 10:54:52 crc kubenswrapper[4891]: I0929 10:54:52.250289 4891 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fxsmx" Sep 29 10:54:52 crc kubenswrapper[4891]: I0929 10:54:52.323353 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxsmx"] Sep 29 10:54:54 crc kubenswrapper[4891]: I0929 10:54:54.225359 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fxsmx" podUID="15781540-68d3-4eb4-8966-eff30e2ebd33" containerName="registry-server" containerID="cri-o://5c9540f66b6e2d4bdacc17e967a975811705402346e8fc0937d1d5089411300e" gracePeriod=2 Sep 29 10:54:54 crc kubenswrapper[4891]: I0929 10:54:54.689405 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxsmx" Sep 29 10:54:54 crc kubenswrapper[4891]: I0929 10:54:54.783578 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15781540-68d3-4eb4-8966-eff30e2ebd33-utilities\") pod \"15781540-68d3-4eb4-8966-eff30e2ebd33\" (UID: \"15781540-68d3-4eb4-8966-eff30e2ebd33\") " Sep 29 10:54:54 crc kubenswrapper[4891]: I0929 10:54:54.783974 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q22vk\" (UniqueName: \"kubernetes.io/projected/15781540-68d3-4eb4-8966-eff30e2ebd33-kube-api-access-q22vk\") pod \"15781540-68d3-4eb4-8966-eff30e2ebd33\" (UID: \"15781540-68d3-4eb4-8966-eff30e2ebd33\") " Sep 29 10:54:54 crc kubenswrapper[4891]: I0929 10:54:54.784173 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15781540-68d3-4eb4-8966-eff30e2ebd33-catalog-content\") pod \"15781540-68d3-4eb4-8966-eff30e2ebd33\" (UID: \"15781540-68d3-4eb4-8966-eff30e2ebd33\") " Sep 29 10:54:54 crc kubenswrapper[4891]: I0929 10:54:54.784501 4891 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15781540-68d3-4eb4-8966-eff30e2ebd33-utilities" (OuterVolumeSpecName: "utilities") pod "15781540-68d3-4eb4-8966-eff30e2ebd33" (UID: "15781540-68d3-4eb4-8966-eff30e2ebd33"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:54:54 crc kubenswrapper[4891]: I0929 10:54:54.785084 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15781540-68d3-4eb4-8966-eff30e2ebd33-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:54:54 crc kubenswrapper[4891]: I0929 10:54:54.819283 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15781540-68d3-4eb4-8966-eff30e2ebd33-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15781540-68d3-4eb4-8966-eff30e2ebd33" (UID: "15781540-68d3-4eb4-8966-eff30e2ebd33"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:54:54 crc kubenswrapper[4891]: I0929 10:54:54.887185 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15781540-68d3-4eb4-8966-eff30e2ebd33-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:54:55 crc kubenswrapper[4891]: I0929 10:54:55.240542 4891 generic.go:334] "Generic (PLEG): container finished" podID="15781540-68d3-4eb4-8966-eff30e2ebd33" containerID="5c9540f66b6e2d4bdacc17e967a975811705402346e8fc0937d1d5089411300e" exitCode=0 Sep 29 10:54:55 crc kubenswrapper[4891]: I0929 10:54:55.240597 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxsmx" Sep 29 10:54:55 crc kubenswrapper[4891]: I0929 10:54:55.240609 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxsmx" event={"ID":"15781540-68d3-4eb4-8966-eff30e2ebd33","Type":"ContainerDied","Data":"5c9540f66b6e2d4bdacc17e967a975811705402346e8fc0937d1d5089411300e"} Sep 29 10:54:55 crc kubenswrapper[4891]: I0929 10:54:55.251407 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxsmx" event={"ID":"15781540-68d3-4eb4-8966-eff30e2ebd33","Type":"ContainerDied","Data":"2b5bfb0ec13f2cc32d21c1b9adac68d5ab34e8e21930c49c4bbdadc5c03be8ca"} Sep 29 10:54:55 crc kubenswrapper[4891]: I0929 10:54:55.251437 4891 scope.go:117] "RemoveContainer" containerID="5c9540f66b6e2d4bdacc17e967a975811705402346e8fc0937d1d5089411300e" Sep 29 10:54:55 crc kubenswrapper[4891]: I0929 10:54:55.281802 4891 scope.go:117] "RemoveContainer" containerID="5ef6c2173f73e7ef028f69a0e24855282aee69a54da2cdd26aab6887fe76fde0" Sep 29 10:54:55 crc kubenswrapper[4891]: I0929 10:54:55.311996 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15781540-68d3-4eb4-8966-eff30e2ebd33-kube-api-access-q22vk" (OuterVolumeSpecName: "kube-api-access-q22vk") pod "15781540-68d3-4eb4-8966-eff30e2ebd33" (UID: "15781540-68d3-4eb4-8966-eff30e2ebd33"). InnerVolumeSpecName "kube-api-access-q22vk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:54:55 crc kubenswrapper[4891]: I0929 10:54:55.359446 4891 scope.go:117] "RemoveContainer" containerID="ec21a5cc0d492e3627ce518b71c11e2d06d37d98c34b10a726d8ff08c105cd0a" Sep 29 10:54:55 crc kubenswrapper[4891]: I0929 10:54:55.395188 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q22vk\" (UniqueName: \"kubernetes.io/projected/15781540-68d3-4eb4-8966-eff30e2ebd33-kube-api-access-q22vk\") on node \"crc\" DevicePath \"\"" Sep 29 10:54:55 crc kubenswrapper[4891]: I0929 10:54:55.397240 4891 scope.go:117] "RemoveContainer" containerID="5c9540f66b6e2d4bdacc17e967a975811705402346e8fc0937d1d5089411300e" Sep 29 10:54:55 crc kubenswrapper[4891]: E0929 10:54:55.398113 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c9540f66b6e2d4bdacc17e967a975811705402346e8fc0937d1d5089411300e\": container with ID starting with 5c9540f66b6e2d4bdacc17e967a975811705402346e8fc0937d1d5089411300e not found: ID does not exist" containerID="5c9540f66b6e2d4bdacc17e967a975811705402346e8fc0937d1d5089411300e" Sep 29 10:54:55 crc kubenswrapper[4891]: I0929 10:54:55.398142 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c9540f66b6e2d4bdacc17e967a975811705402346e8fc0937d1d5089411300e"} err="failed to get container status \"5c9540f66b6e2d4bdacc17e967a975811705402346e8fc0937d1d5089411300e\": rpc error: code = NotFound desc = could not find container \"5c9540f66b6e2d4bdacc17e967a975811705402346e8fc0937d1d5089411300e\": container with ID starting with 5c9540f66b6e2d4bdacc17e967a975811705402346e8fc0937d1d5089411300e not found: ID does not exist" Sep 29 10:54:55 crc kubenswrapper[4891]: I0929 10:54:55.398161 4891 scope.go:117] "RemoveContainer" containerID="5ef6c2173f73e7ef028f69a0e24855282aee69a54da2cdd26aab6887fe76fde0" Sep 29 10:54:55 crc kubenswrapper[4891]: E0929 10:54:55.399087 
4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ef6c2173f73e7ef028f69a0e24855282aee69a54da2cdd26aab6887fe76fde0\": container with ID starting with 5ef6c2173f73e7ef028f69a0e24855282aee69a54da2cdd26aab6887fe76fde0 not found: ID does not exist" containerID="5ef6c2173f73e7ef028f69a0e24855282aee69a54da2cdd26aab6887fe76fde0" Sep 29 10:54:55 crc kubenswrapper[4891]: I0929 10:54:55.399193 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ef6c2173f73e7ef028f69a0e24855282aee69a54da2cdd26aab6887fe76fde0"} err="failed to get container status \"5ef6c2173f73e7ef028f69a0e24855282aee69a54da2cdd26aab6887fe76fde0\": rpc error: code = NotFound desc = could not find container \"5ef6c2173f73e7ef028f69a0e24855282aee69a54da2cdd26aab6887fe76fde0\": container with ID starting with 5ef6c2173f73e7ef028f69a0e24855282aee69a54da2cdd26aab6887fe76fde0 not found: ID does not exist" Sep 29 10:54:55 crc kubenswrapper[4891]: I0929 10:54:55.399212 4891 scope.go:117] "RemoveContainer" containerID="ec21a5cc0d492e3627ce518b71c11e2d06d37d98c34b10a726d8ff08c105cd0a" Sep 29 10:54:55 crc kubenswrapper[4891]: E0929 10:54:55.400660 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec21a5cc0d492e3627ce518b71c11e2d06d37d98c34b10a726d8ff08c105cd0a\": container with ID starting with ec21a5cc0d492e3627ce518b71c11e2d06d37d98c34b10a726d8ff08c105cd0a not found: ID does not exist" containerID="ec21a5cc0d492e3627ce518b71c11e2d06d37d98c34b10a726d8ff08c105cd0a" Sep 29 10:54:55 crc kubenswrapper[4891]: I0929 10:54:55.400688 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec21a5cc0d492e3627ce518b71c11e2d06d37d98c34b10a726d8ff08c105cd0a"} err="failed to get container status \"ec21a5cc0d492e3627ce518b71c11e2d06d37d98c34b10a726d8ff08c105cd0a\": rpc error: code = 
NotFound desc = could not find container \"ec21a5cc0d492e3627ce518b71c11e2d06d37d98c34b10a726d8ff08c105cd0a\": container with ID starting with ec21a5cc0d492e3627ce518b71c11e2d06d37d98c34b10a726d8ff08c105cd0a not found: ID does not exist" Sep 29 10:54:55 crc kubenswrapper[4891]: I0929 10:54:55.592073 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxsmx"] Sep 29 10:54:55 crc kubenswrapper[4891]: I0929 10:54:55.602471 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxsmx"] Sep 29 10:54:56 crc kubenswrapper[4891]: I0929 10:54:56.405512 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15781540-68d3-4eb4-8966-eff30e2ebd33" path="/var/lib/kubelet/pods/15781540-68d3-4eb4-8966-eff30e2ebd33/volumes" Sep 29 10:54:59 crc kubenswrapper[4891]: I0929 10:54:59.979399 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-bspfg_570d72c8-d4ed-4b0a-876a-5a942b32a958/control-plane-machine-set-operator/0.log" Sep 29 10:55:00 crc kubenswrapper[4891]: I0929 10:55:00.142084 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4hjlz_fa68a099-1736-4f9a-bcaf-9840257afaeb/kube-rbac-proxy/0.log" Sep 29 10:55:00 crc kubenswrapper[4891]: I0929 10:55:00.160111 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4hjlz_fa68a099-1736-4f9a-bcaf-9840257afaeb/machine-api-operator/0.log" Sep 29 10:55:13 crc kubenswrapper[4891]: I0929 10:55:13.218027 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-xxtv8_10df5ae8-eb89-4efd-8877-6a87a962fbe7/cert-manager-controller/0.log" Sep 29 10:55:13 crc kubenswrapper[4891]: I0929 10:55:13.345423 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-dtxr6_32062242-b85f-4c38-a6dd-5701216a7a26/cert-manager-cainjector/0.log" Sep 29 10:55:13 crc kubenswrapper[4891]: I0929 10:55:13.435477 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-sf94x_b1de57da-fef3-4c24-a501-7f14e9973be9/cert-manager-webhook/0.log" Sep 29 10:55:25 crc kubenswrapper[4891]: I0929 10:55:25.634830 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-dfnnl_33746ad6-4439-446b-bea7-2797ca5a9c37/nmstate-console-plugin/0.log" Sep 29 10:55:25 crc kubenswrapper[4891]: I0929 10:55:25.735783 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-hkgzc_393d2298-0458-4346-bfe0-d492fb362511/nmstate-handler/0.log" Sep 29 10:55:25 crc kubenswrapper[4891]: I0929 10:55:25.848607 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-7mggl_45398cc7-ef38-4555-befb-ac59051493ed/kube-rbac-proxy/0.log" Sep 29 10:55:25 crc kubenswrapper[4891]: I0929 10:55:25.850875 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-7mggl_45398cc7-ef38-4555-befb-ac59051493ed/nmstate-metrics/0.log" Sep 29 10:55:26 crc kubenswrapper[4891]: I0929 10:55:26.012842 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-qns5g_08922d5d-a7ec-41c0-9085-bcc17847df78/nmstate-operator/0.log" Sep 29 10:55:26 crc kubenswrapper[4891]: I0929 10:55:26.088925 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-h5mfq_0cde15e7-98b3-44c6-9d10-927909f5f269/nmstate-webhook/0.log" Sep 29 10:55:41 crc kubenswrapper[4891]: I0929 10:55:41.267995 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-5d688f5ffc-2q7w2_105a82c3-b488-41fb-a511-69b3c239dbd2/kube-rbac-proxy/0.log" Sep 29 10:55:41 crc kubenswrapper[4891]: I0929 10:55:41.385134 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-2q7w2_105a82c3-b488-41fb-a511-69b3c239dbd2/controller/0.log" Sep 29 10:55:41 crc kubenswrapper[4891]: I0929 10:55:41.468576 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/cp-frr-files/0.log" Sep 29 10:55:41 crc kubenswrapper[4891]: I0929 10:55:41.633154 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/cp-frr-files/0.log" Sep 29 10:55:41 crc kubenswrapper[4891]: I0929 10:55:41.635011 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/cp-metrics/0.log" Sep 29 10:55:41 crc kubenswrapper[4891]: I0929 10:55:41.671281 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/cp-reloader/0.log" Sep 29 10:55:41 crc kubenswrapper[4891]: I0929 10:55:41.679839 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/cp-reloader/0.log" Sep 29 10:55:41 crc kubenswrapper[4891]: I0929 10:55:41.851288 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/cp-metrics/0.log" Sep 29 10:55:41 crc kubenswrapper[4891]: I0929 10:55:41.866156 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/cp-reloader/0.log" Sep 29 10:55:41 crc kubenswrapper[4891]: I0929 10:55:41.866890 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/cp-frr-files/0.log" Sep 29 10:55:41 crc kubenswrapper[4891]: I0929 10:55:41.869824 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/cp-metrics/0.log" Sep 29 10:55:42 crc kubenswrapper[4891]: I0929 10:55:42.049997 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/cp-metrics/0.log" Sep 29 10:55:42 crc kubenswrapper[4891]: I0929 10:55:42.050650 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/cp-frr-files/0.log" Sep 29 10:55:42 crc kubenswrapper[4891]: I0929 10:55:42.069871 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/controller/0.log" Sep 29 10:55:42 crc kubenswrapper[4891]: I0929 10:55:42.089041 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/cp-reloader/0.log" Sep 29 10:55:42 crc kubenswrapper[4891]: I0929 10:55:42.259566 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/kube-rbac-proxy/0.log" Sep 29 10:55:42 crc kubenswrapper[4891]: I0929 10:55:42.265352 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/frr-metrics/0.log" Sep 29 10:55:42 crc kubenswrapper[4891]: I0929 10:55:42.283534 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/kube-rbac-proxy-frr/0.log" Sep 29 10:55:42 crc kubenswrapper[4891]: I0929 10:55:42.449982 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/reloader/0.log" Sep 29 10:55:42 crc kubenswrapper[4891]: I0929 10:55:42.526761 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-bb72s_88e3267b-49e6-443d-8cc6-285a983b44ec/frr-k8s-webhook-server/0.log" Sep 29 10:55:42 crc kubenswrapper[4891]: I0929 10:55:42.725587 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-8dcfb8c5d-mbwfg_8a6839a2-c048-442c-a761-c6c1adec39a2/manager/0.log" Sep 29 10:55:42 crc kubenswrapper[4891]: I0929 10:55:42.880420 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-679f568586-f4xqz_4ef2ef1c-41d9-4878-9d58-f04f8dc07b2f/webhook-server/0.log" Sep 29 10:55:42 crc kubenswrapper[4891]: I0929 10:55:42.940486 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wlhcc_eec47c49-2fdd-4eba-aca2-438041840948/kube-rbac-proxy/0.log" Sep 29 10:55:43 crc kubenswrapper[4891]: I0929 10:55:43.464765 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wlhcc_eec47c49-2fdd-4eba-aca2-438041840948/speaker/0.log" Sep 29 10:55:43 crc kubenswrapper[4891]: I0929 10:55:43.631163 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8hs8n_e14026d1-17ce-4f77-b28a-274da74a5c15/frr/0.log" Sep 29 10:55:56 crc kubenswrapper[4891]: I0929 10:55:56.063603 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8_011f8a2f-1062-4b33-8244-235930966cf1/util/0.log" Sep 29 10:55:56 crc kubenswrapper[4891]: I0929 10:55:56.230331 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8_011f8a2f-1062-4b33-8244-235930966cf1/util/0.log" 
Sep 29 10:55:56 crc kubenswrapper[4891]: I0929 10:55:56.254324 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8_011f8a2f-1062-4b33-8244-235930966cf1/pull/0.log" Sep 29 10:55:56 crc kubenswrapper[4891]: I0929 10:55:56.283522 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8_011f8a2f-1062-4b33-8244-235930966cf1/pull/0.log" Sep 29 10:55:56 crc kubenswrapper[4891]: I0929 10:55:56.420610 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8_011f8a2f-1062-4b33-8244-235930966cf1/pull/0.log" Sep 29 10:55:56 crc kubenswrapper[4891]: I0929 10:55:56.436600 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8_011f8a2f-1062-4b33-8244-235930966cf1/extract/0.log" Sep 29 10:55:56 crc kubenswrapper[4891]: I0929 10:55:56.445209 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcrc8d8_011f8a2f-1062-4b33-8244-235930966cf1/util/0.log" Sep 29 10:55:56 crc kubenswrapper[4891]: I0929 10:55:56.610187 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xxqvf_c751fcd1-3522-4572-a3f1-52acfab7c45d/extract-utilities/0.log" Sep 29 10:55:56 crc kubenswrapper[4891]: I0929 10:55:56.758823 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xxqvf_c751fcd1-3522-4572-a3f1-52acfab7c45d/extract-utilities/0.log" Sep 29 10:55:56 crc kubenswrapper[4891]: I0929 10:55:56.808181 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-xxqvf_c751fcd1-3522-4572-a3f1-52acfab7c45d/extract-content/0.log" Sep 29 10:55:56 crc kubenswrapper[4891]: I0929 10:55:56.809852 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xxqvf_c751fcd1-3522-4572-a3f1-52acfab7c45d/extract-content/0.log" Sep 29 10:55:56 crc kubenswrapper[4891]: I0929 10:55:56.971556 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xxqvf_c751fcd1-3522-4572-a3f1-52acfab7c45d/extract-utilities/0.log" Sep 29 10:55:57 crc kubenswrapper[4891]: I0929 10:55:57.001848 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xxqvf_c751fcd1-3522-4572-a3f1-52acfab7c45d/extract-content/0.log" Sep 29 10:55:57 crc kubenswrapper[4891]: I0929 10:55:57.191533 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5txtz_33c8f323-70e1-4e60-aeb7-f512c245885e/extract-utilities/0.log" Sep 29 10:55:57 crc kubenswrapper[4891]: I0929 10:55:57.418221 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5txtz_33c8f323-70e1-4e60-aeb7-f512c245885e/extract-utilities/0.log" Sep 29 10:55:57 crc kubenswrapper[4891]: I0929 10:55:57.442174 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5txtz_33c8f323-70e1-4e60-aeb7-f512c245885e/extract-content/0.log" Sep 29 10:55:57 crc kubenswrapper[4891]: I0929 10:55:57.478898 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5txtz_33c8f323-70e1-4e60-aeb7-f512c245885e/extract-content/0.log" Sep 29 10:55:57 crc kubenswrapper[4891]: I0929 10:55:57.616582 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-xxqvf_c751fcd1-3522-4572-a3f1-52acfab7c45d/registry-server/0.log" Sep 29 10:55:57 crc kubenswrapper[4891]: I0929 10:55:57.674483 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5txtz_33c8f323-70e1-4e60-aeb7-f512c245885e/extract-content/0.log" Sep 29 10:55:57 crc kubenswrapper[4891]: I0929 10:55:57.676589 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5txtz_33c8f323-70e1-4e60-aeb7-f512c245885e/extract-utilities/0.log" Sep 29 10:55:57 crc kubenswrapper[4891]: I0929 10:55:57.911325 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6_acde17bc-adb2-4193-a40f-d9a062f4f67a/util/0.log" Sep 29 10:55:58 crc kubenswrapper[4891]: I0929 10:55:58.110686 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6_acde17bc-adb2-4193-a40f-d9a062f4f67a/util/0.log" Sep 29 10:55:58 crc kubenswrapper[4891]: I0929 10:55:58.156252 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6_acde17bc-adb2-4193-a40f-d9a062f4f67a/pull/0.log" Sep 29 10:55:58 crc kubenswrapper[4891]: I0929 10:55:58.158403 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6_acde17bc-adb2-4193-a40f-d9a062f4f67a/pull/0.log" Sep 29 10:55:58 crc kubenswrapper[4891]: I0929 10:55:58.406418 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5txtz_33c8f323-70e1-4e60-aeb7-f512c245885e/registry-server/0.log" Sep 29 10:55:58 crc kubenswrapper[4891]: I0929 10:55:58.417342 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6_acde17bc-adb2-4193-a40f-d9a062f4f67a/extract/0.log" Sep 29 10:55:58 crc kubenswrapper[4891]: I0929 10:55:58.419823 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6_acde17bc-adb2-4193-a40f-d9a062f4f67a/pull/0.log" Sep 29 10:55:58 crc kubenswrapper[4891]: I0929 10:55:58.441448 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96vs4p6_acde17bc-adb2-4193-a40f-d9a062f4f67a/util/0.log" Sep 29 10:55:58 crc kubenswrapper[4891]: I0929 10:55:58.585893 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-gfmkm_bcfef239-c4e7-43c6-92f3-2092cd28922b/marketplace-operator/0.log" Sep 29 10:55:58 crc kubenswrapper[4891]: I0929 10:55:58.631921 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j656v_5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe/extract-utilities/0.log" Sep 29 10:55:58 crc kubenswrapper[4891]: I0929 10:55:58.747274 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j656v_5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe/extract-content/0.log" Sep 29 10:55:58 crc kubenswrapper[4891]: I0929 10:55:58.760162 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j656v_5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe/extract-utilities/0.log" Sep 29 10:55:58 crc kubenswrapper[4891]: I0929 10:55:58.829238 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j656v_5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe/extract-content/0.log" Sep 29 10:55:58 crc kubenswrapper[4891]: I0929 10:55:58.969295 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-j656v_5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe/extract-content/0.log" Sep 29 10:55:58 crc kubenswrapper[4891]: I0929 10:55:58.992486 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j656v_5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe/extract-utilities/0.log" Sep 29 10:55:59 crc kubenswrapper[4891]: I0929 10:55:59.107996 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j656v_5d92c0d6-f7a3-4ff2-9efb-400745c4f7fe/registry-server/0.log" Sep 29 10:55:59 crc kubenswrapper[4891]: I0929 10:55:59.125711 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sn22t_5c2eff33-cdda-491e-a057-a6b1e0a2bd10/extract-utilities/0.log" Sep 29 10:55:59 crc kubenswrapper[4891]: I0929 10:55:59.290783 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sn22t_5c2eff33-cdda-491e-a057-a6b1e0a2bd10/extract-content/0.log" Sep 29 10:55:59 crc kubenswrapper[4891]: I0929 10:55:59.328238 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sn22t_5c2eff33-cdda-491e-a057-a6b1e0a2bd10/extract-utilities/0.log" Sep 29 10:55:59 crc kubenswrapper[4891]: I0929 10:55:59.339870 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sn22t_5c2eff33-cdda-491e-a057-a6b1e0a2bd10/extract-content/0.log" Sep 29 10:55:59 crc kubenswrapper[4891]: I0929 10:55:59.496355 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sn22t_5c2eff33-cdda-491e-a057-a6b1e0a2bd10/extract-utilities/0.log" Sep 29 10:55:59 crc kubenswrapper[4891]: I0929 10:55:59.508219 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sn22t_5c2eff33-cdda-491e-a057-a6b1e0a2bd10/extract-content/0.log" Sep 
29 10:55:59 crc kubenswrapper[4891]: I0929 10:55:59.980083 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sn22t_5c2eff33-cdda-491e-a057-a6b1e0a2bd10/registry-server/0.log" Sep 29 10:56:31 crc kubenswrapper[4891]: E0929 10:56:31.755461 4891 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.151:37146->38.102.83.151:34867: write tcp 38.102.83.151:37146->38.102.83.151:34867: write: broken pipe Sep 29 10:57:06 crc kubenswrapper[4891]: I0929 10:57:06.186283 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:57:06 crc kubenswrapper[4891]: I0929 10:57:06.186864 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:57:30 crc kubenswrapper[4891]: I0929 10:57:30.300115 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5ljw8"] Sep 29 10:57:30 crc kubenswrapper[4891]: E0929 10:57:30.301699 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15781540-68d3-4eb4-8966-eff30e2ebd33" containerName="extract-content" Sep 29 10:57:30 crc kubenswrapper[4891]: I0929 10:57:30.301725 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="15781540-68d3-4eb4-8966-eff30e2ebd33" containerName="extract-content" Sep 29 10:57:30 crc kubenswrapper[4891]: E0929 10:57:30.301765 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15781540-68d3-4eb4-8966-eff30e2ebd33" 
containerName="registry-server" Sep 29 10:57:30 crc kubenswrapper[4891]: I0929 10:57:30.301779 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="15781540-68d3-4eb4-8966-eff30e2ebd33" containerName="registry-server" Sep 29 10:57:30 crc kubenswrapper[4891]: E0929 10:57:30.301834 4891 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15781540-68d3-4eb4-8966-eff30e2ebd33" containerName="extract-utilities" Sep 29 10:57:30 crc kubenswrapper[4891]: I0929 10:57:30.301851 4891 state_mem.go:107] "Deleted CPUSet assignment" podUID="15781540-68d3-4eb4-8966-eff30e2ebd33" containerName="extract-utilities" Sep 29 10:57:30 crc kubenswrapper[4891]: I0929 10:57:30.302220 4891 memory_manager.go:354] "RemoveStaleState removing state" podUID="15781540-68d3-4eb4-8966-eff30e2ebd33" containerName="registry-server" Sep 29 10:57:30 crc kubenswrapper[4891]: I0929 10:57:30.305079 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5ljw8" Sep 29 10:57:30 crc kubenswrapper[4891]: I0929 10:57:30.314932 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5ljw8"] Sep 29 10:57:30 crc kubenswrapper[4891]: I0929 10:57:30.374268 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9c601d1-3817-4a88-9070-abe89df84c32-utilities\") pod \"redhat-operators-5ljw8\" (UID: \"c9c601d1-3817-4a88-9070-abe89df84c32\") " pod="openshift-marketplace/redhat-operators-5ljw8" Sep 29 10:57:30 crc kubenswrapper[4891]: I0929 10:57:30.374376 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7qrs\" (UniqueName: \"kubernetes.io/projected/c9c601d1-3817-4a88-9070-abe89df84c32-kube-api-access-w7qrs\") pod \"redhat-operators-5ljw8\" (UID: \"c9c601d1-3817-4a88-9070-abe89df84c32\") " 
pod="openshift-marketplace/redhat-operators-5ljw8" Sep 29 10:57:30 crc kubenswrapper[4891]: I0929 10:57:30.374461 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9c601d1-3817-4a88-9070-abe89df84c32-catalog-content\") pod \"redhat-operators-5ljw8\" (UID: \"c9c601d1-3817-4a88-9070-abe89df84c32\") " pod="openshift-marketplace/redhat-operators-5ljw8" Sep 29 10:57:30 crc kubenswrapper[4891]: I0929 10:57:30.475962 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7qrs\" (UniqueName: \"kubernetes.io/projected/c9c601d1-3817-4a88-9070-abe89df84c32-kube-api-access-w7qrs\") pod \"redhat-operators-5ljw8\" (UID: \"c9c601d1-3817-4a88-9070-abe89df84c32\") " pod="openshift-marketplace/redhat-operators-5ljw8" Sep 29 10:57:30 crc kubenswrapper[4891]: I0929 10:57:30.476074 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9c601d1-3817-4a88-9070-abe89df84c32-catalog-content\") pod \"redhat-operators-5ljw8\" (UID: \"c9c601d1-3817-4a88-9070-abe89df84c32\") " pod="openshift-marketplace/redhat-operators-5ljw8" Sep 29 10:57:30 crc kubenswrapper[4891]: I0929 10:57:30.476261 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9c601d1-3817-4a88-9070-abe89df84c32-utilities\") pod \"redhat-operators-5ljw8\" (UID: \"c9c601d1-3817-4a88-9070-abe89df84c32\") " pod="openshift-marketplace/redhat-operators-5ljw8" Sep 29 10:57:30 crc kubenswrapper[4891]: I0929 10:57:30.476718 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9c601d1-3817-4a88-9070-abe89df84c32-utilities\") pod \"redhat-operators-5ljw8\" (UID: \"c9c601d1-3817-4a88-9070-abe89df84c32\") " 
pod="openshift-marketplace/redhat-operators-5ljw8" Sep 29 10:57:30 crc kubenswrapper[4891]: I0929 10:57:30.477022 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9c601d1-3817-4a88-9070-abe89df84c32-catalog-content\") pod \"redhat-operators-5ljw8\" (UID: \"c9c601d1-3817-4a88-9070-abe89df84c32\") " pod="openshift-marketplace/redhat-operators-5ljw8" Sep 29 10:57:30 crc kubenswrapper[4891]: I0929 10:57:30.495518 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7qrs\" (UniqueName: \"kubernetes.io/projected/c9c601d1-3817-4a88-9070-abe89df84c32-kube-api-access-w7qrs\") pod \"redhat-operators-5ljw8\" (UID: \"c9c601d1-3817-4a88-9070-abe89df84c32\") " pod="openshift-marketplace/redhat-operators-5ljw8" Sep 29 10:57:30 crc kubenswrapper[4891]: I0929 10:57:30.646547 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5ljw8" Sep 29 10:57:31 crc kubenswrapper[4891]: I0929 10:57:31.103932 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5ljw8"] Sep 29 10:57:31 crc kubenswrapper[4891]: I0929 10:57:31.783896 4891 generic.go:334] "Generic (PLEG): container finished" podID="c9c601d1-3817-4a88-9070-abe89df84c32" containerID="7bdcad14cd64a5c1a5cbf87aa7487161e46802c03aeccfc719c09d5c58cec604" exitCode=0 Sep 29 10:57:31 crc kubenswrapper[4891]: I0929 10:57:31.783939 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ljw8" event={"ID":"c9c601d1-3817-4a88-9070-abe89df84c32","Type":"ContainerDied","Data":"7bdcad14cd64a5c1a5cbf87aa7487161e46802c03aeccfc719c09d5c58cec604"} Sep 29 10:57:31 crc kubenswrapper[4891]: I0929 10:57:31.783964 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ljw8" 
event={"ID":"c9c601d1-3817-4a88-9070-abe89df84c32","Type":"ContainerStarted","Data":"fab490942a13d0f10cacd0b72209a082796ab8542cf2c6d0696ca8b50b717e90"} Sep 29 10:57:31 crc kubenswrapper[4891]: I0929 10:57:31.787905 4891 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 10:57:36 crc kubenswrapper[4891]: I0929 10:57:36.067646 4891 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wq4b7"] Sep 29 10:57:36 crc kubenswrapper[4891]: I0929 10:57:36.070552 4891 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wq4b7" Sep 29 10:57:36 crc kubenswrapper[4891]: I0929 10:57:36.097899 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wq4b7"] Sep 29 10:57:36 crc kubenswrapper[4891]: I0929 10:57:36.186001 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:57:36 crc kubenswrapper[4891]: I0929 10:57:36.186057 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:57:36 crc kubenswrapper[4891]: I0929 10:57:36.196771 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a17f74f-11a4-436d-ad17-59331bc498da-catalog-content\") pod \"community-operators-wq4b7\" (UID: \"7a17f74f-11a4-436d-ad17-59331bc498da\") " 
pod="openshift-marketplace/community-operators-wq4b7" Sep 29 10:57:36 crc kubenswrapper[4891]: I0929 10:57:36.196848 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a17f74f-11a4-436d-ad17-59331bc498da-utilities\") pod \"community-operators-wq4b7\" (UID: \"7a17f74f-11a4-436d-ad17-59331bc498da\") " pod="openshift-marketplace/community-operators-wq4b7" Sep 29 10:57:36 crc kubenswrapper[4891]: I0929 10:57:36.196980 4891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbt9t\" (UniqueName: \"kubernetes.io/projected/7a17f74f-11a4-436d-ad17-59331bc498da-kube-api-access-qbt9t\") pod \"community-operators-wq4b7\" (UID: \"7a17f74f-11a4-436d-ad17-59331bc498da\") " pod="openshift-marketplace/community-operators-wq4b7" Sep 29 10:57:36 crc kubenswrapper[4891]: I0929 10:57:36.299347 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a17f74f-11a4-436d-ad17-59331bc498da-utilities\") pod \"community-operators-wq4b7\" (UID: \"7a17f74f-11a4-436d-ad17-59331bc498da\") " pod="openshift-marketplace/community-operators-wq4b7" Sep 29 10:57:36 crc kubenswrapper[4891]: I0929 10:57:36.299441 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbt9t\" (UniqueName: \"kubernetes.io/projected/7a17f74f-11a4-436d-ad17-59331bc498da-kube-api-access-qbt9t\") pod \"community-operators-wq4b7\" (UID: \"7a17f74f-11a4-436d-ad17-59331bc498da\") " pod="openshift-marketplace/community-operators-wq4b7" Sep 29 10:57:36 crc kubenswrapper[4891]: I0929 10:57:36.299531 4891 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a17f74f-11a4-436d-ad17-59331bc498da-catalog-content\") pod \"community-operators-wq4b7\" (UID: 
\"7a17f74f-11a4-436d-ad17-59331bc498da\") " pod="openshift-marketplace/community-operators-wq4b7" Sep 29 10:57:36 crc kubenswrapper[4891]: I0929 10:57:36.299941 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a17f74f-11a4-436d-ad17-59331bc498da-utilities\") pod \"community-operators-wq4b7\" (UID: \"7a17f74f-11a4-436d-ad17-59331bc498da\") " pod="openshift-marketplace/community-operators-wq4b7" Sep 29 10:57:36 crc kubenswrapper[4891]: I0929 10:57:36.299950 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a17f74f-11a4-436d-ad17-59331bc498da-catalog-content\") pod \"community-operators-wq4b7\" (UID: \"7a17f74f-11a4-436d-ad17-59331bc498da\") " pod="openshift-marketplace/community-operators-wq4b7" Sep 29 10:57:36 crc kubenswrapper[4891]: I0929 10:57:36.319311 4891 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbt9t\" (UniqueName: \"kubernetes.io/projected/7a17f74f-11a4-436d-ad17-59331bc498da-kube-api-access-qbt9t\") pod \"community-operators-wq4b7\" (UID: \"7a17f74f-11a4-436d-ad17-59331bc498da\") " pod="openshift-marketplace/community-operators-wq4b7" Sep 29 10:57:36 crc kubenswrapper[4891]: I0929 10:57:36.400561 4891 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wq4b7" Sep 29 10:57:38 crc kubenswrapper[4891]: I0929 10:57:38.847431 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ljw8" event={"ID":"c9c601d1-3817-4a88-9070-abe89df84c32","Type":"ContainerStarted","Data":"2317bb7babc8a79b5ae57f4c392440af5c6fe3ecb6dd604ac86602974018eda0"} Sep 29 10:57:39 crc kubenswrapper[4891]: I0929 10:57:39.004204 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wq4b7"] Sep 29 10:57:39 crc kubenswrapper[4891]: I0929 10:57:39.860472 4891 generic.go:334] "Generic (PLEG): container finished" podID="c9c601d1-3817-4a88-9070-abe89df84c32" containerID="2317bb7babc8a79b5ae57f4c392440af5c6fe3ecb6dd604ac86602974018eda0" exitCode=0 Sep 29 10:57:39 crc kubenswrapper[4891]: I0929 10:57:39.860547 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ljw8" event={"ID":"c9c601d1-3817-4a88-9070-abe89df84c32","Type":"ContainerDied","Data":"2317bb7babc8a79b5ae57f4c392440af5c6fe3ecb6dd604ac86602974018eda0"} Sep 29 10:57:39 crc kubenswrapper[4891]: I0929 10:57:39.863625 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq4b7" event={"ID":"7a17f74f-11a4-436d-ad17-59331bc498da","Type":"ContainerStarted","Data":"69309fc0121e9dde1f0d4eeef44254d7459f1fe0e13ca3564f606d5616be5e4b"} Sep 29 10:57:40 crc kubenswrapper[4891]: I0929 10:57:40.873946 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq4b7" event={"ID":"7a17f74f-11a4-436d-ad17-59331bc498da","Type":"ContainerStarted","Data":"9821d775e590a7602ec7d1c1690e2d9c1b1a17d560162acd801db3e13a0444f0"} Sep 29 10:57:41 crc kubenswrapper[4891]: I0929 10:57:41.886853 4891 generic.go:334] "Generic (PLEG): container finished" podID="7a17f74f-11a4-436d-ad17-59331bc498da" 
containerID="9821d775e590a7602ec7d1c1690e2d9c1b1a17d560162acd801db3e13a0444f0" exitCode=0 Sep 29 10:57:41 crc kubenswrapper[4891]: I0929 10:57:41.886903 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq4b7" event={"ID":"7a17f74f-11a4-436d-ad17-59331bc498da","Type":"ContainerDied","Data":"9821d775e590a7602ec7d1c1690e2d9c1b1a17d560162acd801db3e13a0444f0"} Sep 29 10:57:43 crc kubenswrapper[4891]: I0929 10:57:43.907301 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ljw8" event={"ID":"c9c601d1-3817-4a88-9070-abe89df84c32","Type":"ContainerStarted","Data":"63295b07afee60bb9fcf7120f0d305cd37ce2ae0638d7ff638253d2730076930"} Sep 29 10:57:43 crc kubenswrapper[4891]: I0929 10:57:43.909405 4891 generic.go:334] "Generic (PLEG): container finished" podID="7a17f74f-11a4-436d-ad17-59331bc498da" containerID="adb49bb6b4b8d4069a410aae9d97516d4cdf6e3e5aa0b1a909e3b9af24dd5a0c" exitCode=0 Sep 29 10:57:43 crc kubenswrapper[4891]: I0929 10:57:43.909444 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq4b7" event={"ID":"7a17f74f-11a4-436d-ad17-59331bc498da","Type":"ContainerDied","Data":"adb49bb6b4b8d4069a410aae9d97516d4cdf6e3e5aa0b1a909e3b9af24dd5a0c"} Sep 29 10:57:43 crc kubenswrapper[4891]: I0929 10:57:43.929492 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5ljw8" podStartSLOduration=2.685623416 podStartE2EDuration="13.929477275s" podCreationTimestamp="2025-09-29 10:57:30 +0000 UTC" firstStartedPulling="2025-09-29 10:57:31.787615811 +0000 UTC m=+4181.992784132" lastFinishedPulling="2025-09-29 10:57:43.03146963 +0000 UTC m=+4193.236637991" observedRunningTime="2025-09-29 10:57:43.927054746 +0000 UTC m=+4194.132223067" watchObservedRunningTime="2025-09-29 10:57:43.929477275 +0000 UTC m=+4194.134645596" Sep 29 10:57:44 crc kubenswrapper[4891]: I0929 
10:57:44.921672 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq4b7" event={"ID":"7a17f74f-11a4-436d-ad17-59331bc498da","Type":"ContainerStarted","Data":"902f65c5df194ca49ba62ba71a47eb51b3eace7ff1ff7030879d2a0dc94eb6af"} Sep 29 10:57:44 crc kubenswrapper[4891]: I0929 10:57:44.944364 4891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wq4b7" podStartSLOduration=6.4844190600000005 podStartE2EDuration="8.944342978s" podCreationTimestamp="2025-09-29 10:57:36 +0000 UTC" firstStartedPulling="2025-09-29 10:57:41.892539884 +0000 UTC m=+4192.097708215" lastFinishedPulling="2025-09-29 10:57:44.352463812 +0000 UTC m=+4194.557632133" observedRunningTime="2025-09-29 10:57:44.935284798 +0000 UTC m=+4195.140453139" watchObservedRunningTime="2025-09-29 10:57:44.944342978 +0000 UTC m=+4195.149511309" Sep 29 10:57:46 crc kubenswrapper[4891]: I0929 10:57:46.411702 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wq4b7" Sep 29 10:57:46 crc kubenswrapper[4891]: I0929 10:57:46.412050 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wq4b7" Sep 29 10:57:46 crc kubenswrapper[4891]: I0929 10:57:46.465461 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wq4b7" Sep 29 10:57:50 crc kubenswrapper[4891]: I0929 10:57:50.648017 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5ljw8" Sep 29 10:57:50 crc kubenswrapper[4891]: I0929 10:57:50.649770 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5ljw8" Sep 29 10:57:51 crc kubenswrapper[4891]: I0929 10:57:51.704615 4891 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-5ljw8" podUID="c9c601d1-3817-4a88-9070-abe89df84c32" containerName="registry-server" probeResult="failure" output=< Sep 29 10:57:51 crc kubenswrapper[4891]: timeout: failed to connect service ":50051" within 1s Sep 29 10:57:51 crc kubenswrapper[4891]: > Sep 29 10:57:56 crc kubenswrapper[4891]: I0929 10:57:56.457470 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wq4b7" Sep 29 10:57:56 crc kubenswrapper[4891]: I0929 10:57:56.503819 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wq4b7"] Sep 29 10:57:57 crc kubenswrapper[4891]: I0929 10:57:57.050708 4891 generic.go:334] "Generic (PLEG): container finished" podID="f5c7b77f-4cad-42de-be3d-cba2ad258c6c" containerID="7491fd57c1e0c95191b72a343f7ec9434e0acf4fd1f2847c9b52c17bfb48f1c9" exitCode=0 Sep 29 10:57:57 crc kubenswrapper[4891]: I0929 10:57:57.050819 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9ww5r/must-gather-rt8g7" event={"ID":"f5c7b77f-4cad-42de-be3d-cba2ad258c6c","Type":"ContainerDied","Data":"7491fd57c1e0c95191b72a343f7ec9434e0acf4fd1f2847c9b52c17bfb48f1c9"} Sep 29 10:57:57 crc kubenswrapper[4891]: I0929 10:57:57.050962 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wq4b7" podUID="7a17f74f-11a4-436d-ad17-59331bc498da" containerName="registry-server" containerID="cri-o://902f65c5df194ca49ba62ba71a47eb51b3eace7ff1ff7030879d2a0dc94eb6af" gracePeriod=2 Sep 29 10:57:57 crc kubenswrapper[4891]: I0929 10:57:57.051542 4891 scope.go:117] "RemoveContainer" containerID="7491fd57c1e0c95191b72a343f7ec9434e0acf4fd1f2847c9b52c17bfb48f1c9" Sep 29 10:57:57 crc kubenswrapper[4891]: I0929 10:57:57.184488 4891 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-9ww5r_must-gather-rt8g7_f5c7b77f-4cad-42de-be3d-cba2ad258c6c/gather/0.log" Sep 29 10:57:57 crc kubenswrapper[4891]: I0929 10:57:57.531015 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wq4b7" Sep 29 10:57:57 crc kubenswrapper[4891]: I0929 10:57:57.680262 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbt9t\" (UniqueName: \"kubernetes.io/projected/7a17f74f-11a4-436d-ad17-59331bc498da-kube-api-access-qbt9t\") pod \"7a17f74f-11a4-436d-ad17-59331bc498da\" (UID: \"7a17f74f-11a4-436d-ad17-59331bc498da\") " Sep 29 10:57:57 crc kubenswrapper[4891]: I0929 10:57:57.680627 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a17f74f-11a4-436d-ad17-59331bc498da-utilities\") pod \"7a17f74f-11a4-436d-ad17-59331bc498da\" (UID: \"7a17f74f-11a4-436d-ad17-59331bc498da\") " Sep 29 10:57:57 crc kubenswrapper[4891]: I0929 10:57:57.680708 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a17f74f-11a4-436d-ad17-59331bc498da-catalog-content\") pod \"7a17f74f-11a4-436d-ad17-59331bc498da\" (UID: \"7a17f74f-11a4-436d-ad17-59331bc498da\") " Sep 29 10:57:57 crc kubenswrapper[4891]: I0929 10:57:57.682010 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a17f74f-11a4-436d-ad17-59331bc498da-utilities" (OuterVolumeSpecName: "utilities") pod "7a17f74f-11a4-436d-ad17-59331bc498da" (UID: "7a17f74f-11a4-436d-ad17-59331bc498da"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:57:57 crc kubenswrapper[4891]: I0929 10:57:57.688715 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a17f74f-11a4-436d-ad17-59331bc498da-kube-api-access-qbt9t" (OuterVolumeSpecName: "kube-api-access-qbt9t") pod "7a17f74f-11a4-436d-ad17-59331bc498da" (UID: "7a17f74f-11a4-436d-ad17-59331bc498da"). InnerVolumeSpecName "kube-api-access-qbt9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:57:57 crc kubenswrapper[4891]: I0929 10:57:57.726357 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a17f74f-11a4-436d-ad17-59331bc498da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a17f74f-11a4-436d-ad17-59331bc498da" (UID: "7a17f74f-11a4-436d-ad17-59331bc498da"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:57:57 crc kubenswrapper[4891]: I0929 10:57:57.782651 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbt9t\" (UniqueName: \"kubernetes.io/projected/7a17f74f-11a4-436d-ad17-59331bc498da-kube-api-access-qbt9t\") on node \"crc\" DevicePath \"\"" Sep 29 10:57:57 crc kubenswrapper[4891]: I0929 10:57:57.782685 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a17f74f-11a4-436d-ad17-59331bc498da-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:57:57 crc kubenswrapper[4891]: I0929 10:57:57.782700 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a17f74f-11a4-436d-ad17-59331bc498da-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:57:58 crc kubenswrapper[4891]: I0929 10:57:58.063553 4891 generic.go:334] "Generic (PLEG): container finished" podID="7a17f74f-11a4-436d-ad17-59331bc498da" 
containerID="902f65c5df194ca49ba62ba71a47eb51b3eace7ff1ff7030879d2a0dc94eb6af" exitCode=0 Sep 29 10:57:58 crc kubenswrapper[4891]: I0929 10:57:58.063636 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq4b7" event={"ID":"7a17f74f-11a4-436d-ad17-59331bc498da","Type":"ContainerDied","Data":"902f65c5df194ca49ba62ba71a47eb51b3eace7ff1ff7030879d2a0dc94eb6af"} Sep 29 10:57:58 crc kubenswrapper[4891]: I0929 10:57:58.063921 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq4b7" event={"ID":"7a17f74f-11a4-436d-ad17-59331bc498da","Type":"ContainerDied","Data":"69309fc0121e9dde1f0d4eeef44254d7459f1fe0e13ca3564f606d5616be5e4b"} Sep 29 10:57:58 crc kubenswrapper[4891]: I0929 10:57:58.063941 4891 scope.go:117] "RemoveContainer" containerID="902f65c5df194ca49ba62ba71a47eb51b3eace7ff1ff7030879d2a0dc94eb6af" Sep 29 10:57:58 crc kubenswrapper[4891]: I0929 10:57:58.063657 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wq4b7" Sep 29 10:57:58 crc kubenswrapper[4891]: I0929 10:57:58.100471 4891 scope.go:117] "RemoveContainer" containerID="adb49bb6b4b8d4069a410aae9d97516d4cdf6e3e5aa0b1a909e3b9af24dd5a0c" Sep 29 10:57:58 crc kubenswrapper[4891]: I0929 10:57:58.123948 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wq4b7"] Sep 29 10:57:58 crc kubenswrapper[4891]: I0929 10:57:58.131963 4891 scope.go:117] "RemoveContainer" containerID="9821d775e590a7602ec7d1c1690e2d9c1b1a17d560162acd801db3e13a0444f0" Sep 29 10:57:58 crc kubenswrapper[4891]: I0929 10:57:58.134589 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wq4b7"] Sep 29 10:57:58 crc kubenswrapper[4891]: I0929 10:57:58.173517 4891 scope.go:117] "RemoveContainer" containerID="902f65c5df194ca49ba62ba71a47eb51b3eace7ff1ff7030879d2a0dc94eb6af" Sep 29 10:57:58 crc kubenswrapper[4891]: E0929 10:57:58.173970 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"902f65c5df194ca49ba62ba71a47eb51b3eace7ff1ff7030879d2a0dc94eb6af\": container with ID starting with 902f65c5df194ca49ba62ba71a47eb51b3eace7ff1ff7030879d2a0dc94eb6af not found: ID does not exist" containerID="902f65c5df194ca49ba62ba71a47eb51b3eace7ff1ff7030879d2a0dc94eb6af" Sep 29 10:57:58 crc kubenswrapper[4891]: I0929 10:57:58.174014 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"902f65c5df194ca49ba62ba71a47eb51b3eace7ff1ff7030879d2a0dc94eb6af"} err="failed to get container status \"902f65c5df194ca49ba62ba71a47eb51b3eace7ff1ff7030879d2a0dc94eb6af\": rpc error: code = NotFound desc = could not find container \"902f65c5df194ca49ba62ba71a47eb51b3eace7ff1ff7030879d2a0dc94eb6af\": container with ID starting with 902f65c5df194ca49ba62ba71a47eb51b3eace7ff1ff7030879d2a0dc94eb6af not 
found: ID does not exist" Sep 29 10:57:58 crc kubenswrapper[4891]: I0929 10:57:58.174057 4891 scope.go:117] "RemoveContainer" containerID="adb49bb6b4b8d4069a410aae9d97516d4cdf6e3e5aa0b1a909e3b9af24dd5a0c" Sep 29 10:57:58 crc kubenswrapper[4891]: E0929 10:57:58.174485 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adb49bb6b4b8d4069a410aae9d97516d4cdf6e3e5aa0b1a909e3b9af24dd5a0c\": container with ID starting with adb49bb6b4b8d4069a410aae9d97516d4cdf6e3e5aa0b1a909e3b9af24dd5a0c not found: ID does not exist" containerID="adb49bb6b4b8d4069a410aae9d97516d4cdf6e3e5aa0b1a909e3b9af24dd5a0c" Sep 29 10:57:58 crc kubenswrapper[4891]: I0929 10:57:58.174508 4891 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adb49bb6b4b8d4069a410aae9d97516d4cdf6e3e5aa0b1a909e3b9af24dd5a0c"} err="failed to get container status \"adb49bb6b4b8d4069a410aae9d97516d4cdf6e3e5aa0b1a909e3b9af24dd5a0c\": rpc error: code = NotFound desc = could not find container \"adb49bb6b4b8d4069a410aae9d97516d4cdf6e3e5aa0b1a909e3b9af24dd5a0c\": container with ID starting with adb49bb6b4b8d4069a410aae9d97516d4cdf6e3e5aa0b1a909e3b9af24dd5a0c not found: ID does not exist" Sep 29 10:57:58 crc kubenswrapper[4891]: I0929 10:57:58.174521 4891 scope.go:117] "RemoveContainer" containerID="9821d775e590a7602ec7d1c1690e2d9c1b1a17d560162acd801db3e13a0444f0" Sep 29 10:57:58 crc kubenswrapper[4891]: E0929 10:57:58.175064 4891 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9821d775e590a7602ec7d1c1690e2d9c1b1a17d560162acd801db3e13a0444f0\": container with ID starting with 9821d775e590a7602ec7d1c1690e2d9c1b1a17d560162acd801db3e13a0444f0 not found: ID does not exist" containerID="9821d775e590a7602ec7d1c1690e2d9c1b1a17d560162acd801db3e13a0444f0" Sep 29 10:57:58 crc kubenswrapper[4891]: I0929 10:57:58.175109 4891 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9821d775e590a7602ec7d1c1690e2d9c1b1a17d560162acd801db3e13a0444f0"} err="failed to get container status \"9821d775e590a7602ec7d1c1690e2d9c1b1a17d560162acd801db3e13a0444f0\": rpc error: code = NotFound desc = could not find container \"9821d775e590a7602ec7d1c1690e2d9c1b1a17d560162acd801db3e13a0444f0\": container with ID starting with 9821d775e590a7602ec7d1c1690e2d9c1b1a17d560162acd801db3e13a0444f0 not found: ID does not exist" Sep 29 10:57:58 crc kubenswrapper[4891]: I0929 10:57:58.409657 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a17f74f-11a4-436d-ad17-59331bc498da" path="/var/lib/kubelet/pods/7a17f74f-11a4-436d-ad17-59331bc498da/volumes" Sep 29 10:58:01 crc kubenswrapper[4891]: I0929 10:58:01.708580 4891 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5ljw8" podUID="c9c601d1-3817-4a88-9070-abe89df84c32" containerName="registry-server" probeResult="failure" output=< Sep 29 10:58:01 crc kubenswrapper[4891]: timeout: failed to connect service ":50051" within 1s Sep 29 10:58:01 crc kubenswrapper[4891]: > Sep 29 10:58:06 crc kubenswrapper[4891]: I0929 10:58:06.186353 4891 patch_prober.go:28] interesting pod/machine-config-daemon-gb8tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:58:06 crc kubenswrapper[4891]: I0929 10:58:06.187247 4891 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:58:06 crc kubenswrapper[4891]: I0929 
10:58:06.187362 4891 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" Sep 29 10:58:06 crc kubenswrapper[4891]: I0929 10:58:06.188187 4891 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2da827005486a0a15bbdd7c43a77ee70ba395e814d8c72e53cadb7aaabd42575"} pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 10:58:06 crc kubenswrapper[4891]: I0929 10:58:06.188250 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" podUID="582de198-5a15-4c4c-aaea-881c638a42ac" containerName="machine-config-daemon" containerID="cri-o://2da827005486a0a15bbdd7c43a77ee70ba395e814d8c72e53cadb7aaabd42575" gracePeriod=600 Sep 29 10:58:07 crc kubenswrapper[4891]: I0929 10:58:07.165011 4891 generic.go:334] "Generic (PLEG): container finished" podID="582de198-5a15-4c4c-aaea-881c638a42ac" containerID="2da827005486a0a15bbdd7c43a77ee70ba395e814d8c72e53cadb7aaabd42575" exitCode=0 Sep 29 10:58:07 crc kubenswrapper[4891]: I0929 10:58:07.165191 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerDied","Data":"2da827005486a0a15bbdd7c43a77ee70ba395e814d8c72e53cadb7aaabd42575"} Sep 29 10:58:07 crc kubenswrapper[4891]: I0929 10:58:07.165364 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gb8tp" event={"ID":"582de198-5a15-4c4c-aaea-881c638a42ac","Type":"ContainerStarted","Data":"a335b74c68dd933b06611a3cccfee0dad5b86e25086ea64e289a1242f77ee740"} Sep 29 10:58:07 crc kubenswrapper[4891]: I0929 10:58:07.165388 4891 scope.go:117] 
"RemoveContainer" containerID="a9e5e6200fb86051d115ba9bec6e66879cd0bcc43639b96bfba99ed67ede89f4" Sep 29 10:58:09 crc kubenswrapper[4891]: I0929 10:58:09.785323 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9ww5r/must-gather-rt8g7"] Sep 29 10:58:09 crc kubenswrapper[4891]: I0929 10:58:09.786197 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-9ww5r/must-gather-rt8g7" podUID="f5c7b77f-4cad-42de-be3d-cba2ad258c6c" containerName="copy" containerID="cri-o://05b463472d2fba6c06a5111ca4bb0e542a5a6c9cad188d589818fa0cf363e92e" gracePeriod=2 Sep 29 10:58:09 crc kubenswrapper[4891]: I0929 10:58:09.794746 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9ww5r/must-gather-rt8g7"] Sep 29 10:58:10 crc kubenswrapper[4891]: I0929 10:58:10.211339 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9ww5r_must-gather-rt8g7_f5c7b77f-4cad-42de-be3d-cba2ad258c6c/copy/0.log" Sep 29 10:58:10 crc kubenswrapper[4891]: I0929 10:58:10.212005 4891 generic.go:334] "Generic (PLEG): container finished" podID="f5c7b77f-4cad-42de-be3d-cba2ad258c6c" containerID="05b463472d2fba6c06a5111ca4bb0e542a5a6c9cad188d589818fa0cf363e92e" exitCode=143 Sep 29 10:58:10 crc kubenswrapper[4891]: I0929 10:58:10.212054 4891 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c89c3f86bcb26e07e14be9826cc709b854fbb420f3cd1268c28b647a29b78396" Sep 29 10:58:10 crc kubenswrapper[4891]: I0929 10:58:10.214272 4891 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9ww5r_must-gather-rt8g7_f5c7b77f-4cad-42de-be3d-cba2ad258c6c/copy/0.log" Sep 29 10:58:10 crc kubenswrapper[4891]: I0929 10:58:10.215023 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9ww5r/must-gather-rt8g7" Sep 29 10:58:10 crc kubenswrapper[4891]: I0929 10:58:10.324808 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f5c7b77f-4cad-42de-be3d-cba2ad258c6c-must-gather-output\") pod \"f5c7b77f-4cad-42de-be3d-cba2ad258c6c\" (UID: \"f5c7b77f-4cad-42de-be3d-cba2ad258c6c\") " Sep 29 10:58:10 crc kubenswrapper[4891]: I0929 10:58:10.324848 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f2kl\" (UniqueName: \"kubernetes.io/projected/f5c7b77f-4cad-42de-be3d-cba2ad258c6c-kube-api-access-7f2kl\") pod \"f5c7b77f-4cad-42de-be3d-cba2ad258c6c\" (UID: \"f5c7b77f-4cad-42de-be3d-cba2ad258c6c\") " Sep 29 10:58:10 crc kubenswrapper[4891]: I0929 10:58:10.351269 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5c7b77f-4cad-42de-be3d-cba2ad258c6c-kube-api-access-7f2kl" (OuterVolumeSpecName: "kube-api-access-7f2kl") pod "f5c7b77f-4cad-42de-be3d-cba2ad258c6c" (UID: "f5c7b77f-4cad-42de-be3d-cba2ad258c6c"). InnerVolumeSpecName "kube-api-access-7f2kl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:58:10 crc kubenswrapper[4891]: I0929 10:58:10.428094 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f2kl\" (UniqueName: \"kubernetes.io/projected/f5c7b77f-4cad-42de-be3d-cba2ad258c6c-kube-api-access-7f2kl\") on node \"crc\" DevicePath \"\"" Sep 29 10:58:10 crc kubenswrapper[4891]: I0929 10:58:10.515411 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5c7b77f-4cad-42de-be3d-cba2ad258c6c-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f5c7b77f-4cad-42de-be3d-cba2ad258c6c" (UID: "f5c7b77f-4cad-42de-be3d-cba2ad258c6c"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:58:10 crc kubenswrapper[4891]: I0929 10:58:10.530014 4891 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f5c7b77f-4cad-42de-be3d-cba2ad258c6c-must-gather-output\") on node \"crc\" DevicePath \"\"" Sep 29 10:58:10 crc kubenswrapper[4891]: I0929 10:58:10.702577 4891 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5ljw8" Sep 29 10:58:10 crc kubenswrapper[4891]: I0929 10:58:10.751083 4891 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5ljw8" Sep 29 10:58:10 crc kubenswrapper[4891]: I0929 10:58:10.834359 4891 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5ljw8"] Sep 29 10:58:10 crc kubenswrapper[4891]: I0929 10:58:10.943008 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sn22t"] Sep 29 10:58:10 crc kubenswrapper[4891]: I0929 10:58:10.944917 4891 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sn22t" podUID="5c2eff33-cdda-491e-a057-a6b1e0a2bd10" containerName="registry-server" containerID="cri-o://21c28b138e5f0fe7c754537e30c5308cd9442f2dec2a7b504ee92b3ff0ef252f" gracePeriod=2 Sep 29 10:58:11 crc kubenswrapper[4891]: I0929 10:58:11.223356 4891 generic.go:334] "Generic (PLEG): container finished" podID="5c2eff33-cdda-491e-a057-a6b1e0a2bd10" containerID="21c28b138e5f0fe7c754537e30c5308cd9442f2dec2a7b504ee92b3ff0ef252f" exitCode=0 Sep 29 10:58:11 crc kubenswrapper[4891]: I0929 10:58:11.223427 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn22t" event={"ID":"5c2eff33-cdda-491e-a057-a6b1e0a2bd10","Type":"ContainerDied","Data":"21c28b138e5f0fe7c754537e30c5308cd9442f2dec2a7b504ee92b3ff0ef252f"} Sep 29 
10:58:11 crc kubenswrapper[4891]: I0929 10:58:11.223665 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9ww5r/must-gather-rt8g7" Sep 29 10:58:11 crc kubenswrapper[4891]: I0929 10:58:11.415762 4891 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sn22t" Sep 29 10:58:11 crc kubenswrapper[4891]: I0929 10:58:11.551159 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c2eff33-cdda-491e-a057-a6b1e0a2bd10-catalog-content\") pod \"5c2eff33-cdda-491e-a057-a6b1e0a2bd10\" (UID: \"5c2eff33-cdda-491e-a057-a6b1e0a2bd10\") " Sep 29 10:58:11 crc kubenswrapper[4891]: I0929 10:58:11.551361 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pl9v\" (UniqueName: \"kubernetes.io/projected/5c2eff33-cdda-491e-a057-a6b1e0a2bd10-kube-api-access-4pl9v\") pod \"5c2eff33-cdda-491e-a057-a6b1e0a2bd10\" (UID: \"5c2eff33-cdda-491e-a057-a6b1e0a2bd10\") " Sep 29 10:58:11 crc kubenswrapper[4891]: I0929 10:58:11.551419 4891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c2eff33-cdda-491e-a057-a6b1e0a2bd10-utilities\") pod \"5c2eff33-cdda-491e-a057-a6b1e0a2bd10\" (UID: \"5c2eff33-cdda-491e-a057-a6b1e0a2bd10\") " Sep 29 10:58:11 crc kubenswrapper[4891]: I0929 10:58:11.553240 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c2eff33-cdda-491e-a057-a6b1e0a2bd10-utilities" (OuterVolumeSpecName: "utilities") pod "5c2eff33-cdda-491e-a057-a6b1e0a2bd10" (UID: "5c2eff33-cdda-491e-a057-a6b1e0a2bd10"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:58:11 crc kubenswrapper[4891]: I0929 10:58:11.561974 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c2eff33-cdda-491e-a057-a6b1e0a2bd10-kube-api-access-4pl9v" (OuterVolumeSpecName: "kube-api-access-4pl9v") pod "5c2eff33-cdda-491e-a057-a6b1e0a2bd10" (UID: "5c2eff33-cdda-491e-a057-a6b1e0a2bd10"). InnerVolumeSpecName "kube-api-access-4pl9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:58:11 crc kubenswrapper[4891]: I0929 10:58:11.654325 4891 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c2eff33-cdda-491e-a057-a6b1e0a2bd10-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:58:11 crc kubenswrapper[4891]: I0929 10:58:11.654652 4891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pl9v\" (UniqueName: \"kubernetes.io/projected/5c2eff33-cdda-491e-a057-a6b1e0a2bd10-kube-api-access-4pl9v\") on node \"crc\" DevicePath \"\"" Sep 29 10:58:11 crc kubenswrapper[4891]: I0929 10:58:11.665371 4891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c2eff33-cdda-491e-a057-a6b1e0a2bd10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c2eff33-cdda-491e-a057-a6b1e0a2bd10" (UID: "5c2eff33-cdda-491e-a057-a6b1e0a2bd10"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:58:11 crc kubenswrapper[4891]: I0929 10:58:11.756753 4891 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c2eff33-cdda-491e-a057-a6b1e0a2bd10-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:58:12 crc kubenswrapper[4891]: I0929 10:58:12.232503 4891 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sn22t" Sep 29 10:58:12 crc kubenswrapper[4891]: I0929 10:58:12.232539 4891 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn22t" event={"ID":"5c2eff33-cdda-491e-a057-a6b1e0a2bd10","Type":"ContainerDied","Data":"ae04ef6659973c8881039a413d05c2718d76ff56a1cd1498b683f705f80a111e"} Sep 29 10:58:12 crc kubenswrapper[4891]: I0929 10:58:12.232585 4891 scope.go:117] "RemoveContainer" containerID="21c28b138e5f0fe7c754537e30c5308cd9442f2dec2a7b504ee92b3ff0ef252f" Sep 29 10:58:12 crc kubenswrapper[4891]: I0929 10:58:12.284993 4891 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sn22t"] Sep 29 10:58:12 crc kubenswrapper[4891]: I0929 10:58:12.294450 4891 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sn22t"] Sep 29 10:58:12 crc kubenswrapper[4891]: I0929 10:58:12.298768 4891 scope.go:117] "RemoveContainer" containerID="e4d0ec582572cb473093b9135646d51b37ddaf22299a23c14bbdf128428f881e" Sep 29 10:58:12 crc kubenswrapper[4891]: I0929 10:58:12.322211 4891 scope.go:117] "RemoveContainer" containerID="9f05eea26b9a8b8464c79abd1be381a73b2abeb9ae7a2c6e04833cd56107400d" Sep 29 10:58:12 crc kubenswrapper[4891]: I0929 10:58:12.406836 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c2eff33-cdda-491e-a057-a6b1e0a2bd10" path="/var/lib/kubelet/pods/5c2eff33-cdda-491e-a057-a6b1e0a2bd10/volumes" Sep 29 10:58:12 crc kubenswrapper[4891]: I0929 10:58:12.408086 4891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5c7b77f-4cad-42de-be3d-cba2ad258c6c" path="/var/lib/kubelet/pods/f5c7b77f-4cad-42de-be3d-cba2ad258c6c/volumes" Sep 29 10:59:18 crc kubenswrapper[4891]: I0929 10:59:18.965909 4891 scope.go:117] "RemoveContainer" containerID="7491fd57c1e0c95191b72a343f7ec9434e0acf4fd1f2847c9b52c17bfb48f1c9" Sep 29 10:59:19 crc kubenswrapper[4891]: I0929 
10:59:19.023886 4891 scope.go:117] "RemoveContainer" containerID="05b463472d2fba6c06a5111ca4bb0e542a5a6c9cad188d589818fa0cf363e92e"